Parallel programming

'''Parallel programming''' is a [[computer]] [[programming]] technique that provides for the execution of operations in parallel, either within a single computer or across a number of systems. The latter case is a form of [[distributed computing]], a method of information processing in which work is performed by separate computers linked through a communications network.


Some people consider parallel programming to be synonymous with [[Concurrent programming language|concurrent programming]]. Others draw a distinction between ''concurrent programming'' in general, which encompasses both parallel execution and interaction between parallel entities while they execute, and ''parallel programming'', which involves parallelism ''without'' interaction between executing entities. The essence of parallel programming in the latter sense is to split a single task into a number of subtasks that can be computed relatively independently and then aggregated into a single coherent solution. The independence of the subtasks removes the need for [[Mutual exclusion|exclusive access to shared resources]] and for [[Synchronization|synchronization]] between subtasks, which allows [[Multiprocessor|multiprocessor]] machines to outperform their single-processor counterparts by executing independent subtasks on separate processors. Such techniques are most effective for problems that can easily be broken down into independent tasks, such as purely mathematical problems (e.g. factorization). Not every algorithm can be restructured to run in parallel, however, and parallel execution introduces performance issues on multiprocessor systems that differ from those encountered on single-processor systems.
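As a minimal sketch of this split-and-aggregate style, the following Python program finds the divisors of a number up to its square root by partitioning the trial range into independent slices, one per worker process. The use of Python's standard <code>multiprocessing</code> module and names such as <code>factors_in_range</code> are illustrative choices for this sketch, not part of any standard approach.

<syntaxhighlight lang="python">
from multiprocessing import Pool

def factors_in_range(args):
    """Independent subtask: test one slice [lo, hi) for divisors of n."""
    n, lo, hi = args
    return [d for d in range(lo, hi) if n % d == 0]

if __name__ == "__main__":
    n = 600851475143          # number to examine (arbitrary example value)
    workers = 4               # number of parallel subtasks
    limit = int(n ** 0.5) + 1 # divisors come in pairs, so search up to sqrt(n)
    step = (limit - 2) // workers + 1

    # Split the search space [2, limit) into independent, non-overlapping slices.
    tasks = [(n, lo, min(lo + step, limit)) for lo in range(2, limit, step)]

    # Each slice runs in its own process; the slices share no state.
    with Pool(workers) as pool:
        partial = pool.map(factors_in_range, tasks)

    # Aggregate the independent partial results into a single solution.
    factors = sorted(d for chunk in partial for d in chunk)
    print(factors)
</syntaxhighlight>

Because the slices share no state, the workers need no locks or other synchronization primitives; the only coordination is the final aggregation of the partial result lists.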


==See also==
