
6 - Message-passing Programming

Published online by Cambridge University Press:  06 January 2017

Zbigniew J. Czech
Affiliation:
Silesia University of Technology, Gliwice, Poland

Summary

INTRODUCTION

One type of parallel processing is distributed computing. This type of processing can be conducted in integrated computers with distributed memory, or in clusters, which are systems of homogeneous or heterogeneous networked computers. In distributed computing the tasks communicate via communication channels (or links). The channels form an interconnection network between processors or computers. Processors or computers, which are the vertices of the network, perform computing tasks and send and receive messages.

In this chapter we explore how to implement parallel programs that consist of tasks cooperating with each other using message passing. Parallel programs should be written in a suitable programming language. Probably the only language developed specifically to describe parallel computing with message passing was occam. The language, proposed by May et al. at the Inmos company, was based on the CSP notation (an acronym for Communicating Sequential Processes) defined by Hoare. In the 1980s occam was used as the programming language for transputers, large-scale-integration devices each combining a processor with four communication channels. As computer hardware developed, it turned out that occam, owing to certain weaknesses and restrictions, was insufficient for describing distributed computing. Nowadays such computations are usually carried out in C or Fortran augmented with functions intended for the cooperation of parallel processes. The most popular libraries of such functions are PVM (Parallel Virtual Machine) and MPI (Message Passing Interface).

The PVM library was developed at Oak Ridge National Laboratory. It permits the creation and execution of parallel programs in heterogeneous networks consisting of sequential and parallel computers. Another popular and largely universal library used to build distributed programs is MPI. It can be applied together with the OpenMP interface in computers with distributed memory (see Section 5.4.3), in particular in clusters composed of multicore processors or SMP nodes (see Sections 5.4.2 and 5.4.3, and Chapter 7). The library is highly portable, making it possible to build scalable programs for applications where achieving high computational performance is essential.

Type: Chapter
Publisher: Cambridge University Press
Print publication year: 2017
