Block swap algorithms

From Wikipedia, the free encyclopedia

In computer algorithms, block swap algorithms swap two regions of elements of an array. Swapping two non-overlapping regions of equal size is simple. However, swapping two adjacent, non-overlapping regions of unequal size in place is not (such a swap is equivalent to an array rotation). Three algorithms are known to accomplish this: Bentley's juggling (also known as the dolphin algorithm[1]), Gries-Mills, and the reversal algorithm.[2] All three algorithms run in linear time, O(n) (see time complexity).
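
To make the problem concrete, here is a straightforward (but not in-place) sketch in C; the function name and the caller-supplied scratch buffer are illustrative assumptions, not part of any of the cited algorithms. Swapping the adjacent regions A = a[0..m-1] and B = a[m..n-1] is the same as rotating a[0..n-1] left by m positions; the three in-place algorithms achieve the same result without the O(m) auxiliary buffer used here.

  #include <stddef.h>
  #include <string.h>

  /* Swap adjacent regions A = a[0..m-1] and B = a[m..n-1], i.e. rotate
     a[0..n-1] left by m positions, using a scratch buffer of m elements. */
  void block_swap_with_buffer(int a[], size_t n, size_t m, int scratch[])
  {
      memcpy(scratch, a, m * sizeof a[0]);            /* save region A           */
      memmove(a, a + m, (n - m) * sizeof a[0]);       /* slide region B forward  */
      memcpy(a + (n - m), scratch, m * sizeof a[0]);  /* place region A after B  */
  }

For example, calling block_swap_with_buffer on {1, 2, 3, 4, 5} with n = 5 and m = 2 leaves {3, 4, 5, 1, 2}.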

Reversal algorithm

The reversal algorithm is the simplest to explain. Its building block is an in-place reversal of a range of array elements, which swaps pairs of elements from the outside in and works whether the range contains an even or odd number of elements. The reversal algorithm uses three such in-place reversals to accomplish an in-place block swap (a code sketch follows the list):

  • Reverse region A
  • Reverse region B
  • Reverse region AB
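
A minimal sketch of the three-reversal block swap in C (the function names are illustrative, not from the cited sources):

  #include <stddef.h>

  /* Reverse a[lo..hi] in place by swapping elements from the outside in. */
  static void reverse_range(int a[], size_t lo, size_t hi)
  {
      while (lo < hi) {
          int t = a[lo];
          a[lo++] = a[hi];
          a[hi--] = t;
      }
  }

  /* Swap the adjacent regions A = a[0..m-1] and B = a[m..n-1] in place,
     i.e. rotate a[0..n-1] left by m, using three reversals. */
  void block_swap_reversal(int a[], size_t n, size_t m)
  {
      if (m == 0 || m >= n)
          return;
      reverse_range(a, 0, m - 1);   /* reverse region A  */
      reverse_range(a, m, n - 1);   /* reverse region B  */
      reverse_range(a, 0, n - 1);   /* reverse region AB */
  }

For example, on {1, 2, 3, 4, 5} with m = 2 the three reversals produce {2, 1, 3, 4, 5}, then {2, 1, 5, 4, 3}, and finally {3, 4, 5, 1, 2}.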

The Gries-Mills and reversal algorithms perform better than Bentley's juggling because of their cache-friendly memory access patterns.
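
Since the Gries-Mills algorithm is only named above, here is a minimal C sketch of a rotation by repeated equal-sized block swaps in that style (the function names and exact loop structure are illustrative, not taken verbatim from the cited sources). Every pass scans two blocks sequentially, which is the source of the cache-friendly access pattern.

  #include <stddef.h>

  /* Exchange a[p..p+len-1] with a[q..q+len-1]; the blocks must not overlap. */
  static void swap_blocks(int a[], size_t p, size_t q, size_t len)
  {
      while (len--) {
          int t = a[p];
          a[p++] = a[q];
          a[q++] = t;
      }
  }

  /* Swap the adjacent regions A = a[0..m-1] and B = a[m..n-1] in place,
     i.e. rotate a[0..n-1] left by m, by repeated equal-sized block swaps. */
  void block_swap_gries_mills(int a[], size_t n, size_t m)
  {
      if (m == 0 || m >= n)
          return;
      size_t i = m;        /* length of the left block still to be placed  */
      size_t j = n - m;    /* length of the right block still to be placed */
      while (i != j) {
          if (i > j) {
              /* right block is shorter: swap it to its final place on the left */
              swap_blocks(a, m - i, m, j);
              i -= j;
          } else {
              /* left block is shorter: swap it to its final place on the right */
              swap_blocks(a, m - i, m + j - i, i);
              j -= i;
          }
      }
      swap_blocks(a, m - i, m, i);   /* the two remaining blocks are equal */
  }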

The reversal algorithm parallelizes well, because each reversal consists of independent pairwise swaps that can be split into sub-ranges and performed in parallel.
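
A minimal sketch of that idea, assuming OpenMP is available (compile with -fopenmp; the function name is illustrative):

  #include <stddef.h>

  /* Reverse a[lo..hi] in place. Iteration k swaps a[lo+k] with a[hi-k];
     different iterations touch disjoint pairs of elements, so the loop
     can be split across threads. */
  void reverse_range_parallel(int a[], ptrdiff_t lo, ptrdiff_t hi)
  {
      ptrdiff_t half = (hi - lo + 1) / 2;
      #pragma omp parallel for
      for (ptrdiff_t k = 0; k < half; k++) {
          int t = a[lo + k];
          a[lo + k] = a[hi - k];
          a[hi - k] = t;
      }
  }

Without OpenMP the pragma is ignored and the function performs the reversal serially.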

References

  1. D. Gries, H. Mills (1981), Swapping Sections, https://hdl.handle.net/1813/6292
  2. Jon Bentley, Programming Pearls, pp. 13–15, 209–211.