Parallel programming

From Wikipedia, the free encyclopedia
This is an old revision of this page, as edited by 150.217.14.163 (talk) at 11:57, 28 October 2004 (See also). The present address (URL) is a permanent link to this revision, which may differ significantly from the current revision.

Parallel programming (also concurrent programming) is a computer programming technique that provides for the execution of operations concurrently, either within a single computer or across a number of systems. In the latter case, the term distributed computing is used. Multiprocessor machines achieve better performance by taking advantage of this kind of programming.

In parallel programming, a single task is split into a number of subtasks that can be computed relatively independently and then aggregated to form a single coherent solution. Parallel programming is most effective for tasks that can easily be broken down into independent subtasks, such as purely mathematical problems, e.g. factorisation.
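The split–compute–aggregate pattern described above can be sketched in Python (a minimal illustration, not from the original article; the function names and the choice of summing squares are invented for the example, and a thread pool is used for brevity even though a process pool would give true CPU parallelism in CPython):

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    # Subtask: each chunk is processed independently of the others.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Split the single task into relatively independent subtasks.
    chunks = [data[i::workers] for i in range(workers)]
    # Compute the subtasks concurrently.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        partials = pool.map(partial_sum, chunks)
    # Aggregate the partial results into a single coherent solution.
    return sum(partials)

if __name__ == "__main__":
    print(parallel_sum_of_squares(list(range(1000))))  # prints 332833500
```

Summation parallelises cleanly because the subtasks share no state and the aggregation step (adding the partial sums) is cheap; tasks whose subtasks must communicate or synchronise heavily benefit far less.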

One way to achieve parallel programming is through distributed computing, which is a method of information processing in which work is performed by separate computers linked through a communications network.

Pioneers in the field of concurrent programming include Edsger Dijkstra and C. A. R. Hoare.

See also