
Strassen algorithm

From Wikipedia, the free encyclopedia

In the mathematical discipline of linear algebra, the Strassen algorithm, named after Volker Strassen, is an algorithm used for matrix multiplication. It is asymptotically faster than the standard matrix multiplication algorithm, but slower than the fastest known algorithm.

History

Volker Strassen published the Strassen algorithm in 1969. Although his algorithm is only slightly faster than the standard algorithm for matrix multiplication, he was the first to point out that Gaussian elimination is not optimal. His paper started the search for even faster algorithms, such as the Winograd algorithm in 1980 (which also uses 7 multiplications but only 15 additions, instead of the 18 used by the Strassen algorithm), and the more complex Coppersmith–Winograd algorithm published in 1987.

Algorithm

Let A, B be two square matrices over a field F. We want to calculate the matrix product C as

\[ C = A B, \qquad A, B, C \in F^{2^n \times 2^n}. \]

If the matrices A, B are not of type 2^n × 2^n, we fill the missing rows and columns with zeros.
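A small Python sketch of this zero-padding step (using NumPy; the helper name pad_to_power_of_two is ours, not from the article):

import numpy as np

def pad_to_power_of_two(M):
    """Embed M in the smallest 2^n x 2^n matrix, filling new rows and columns with zeros."""
    rows, cols = M.shape
    size = 1
    while size < max(rows, cols):
        size *= 2
    padded = np.zeros((size, size), dtype=M.dtype)
    padded[:rows, :cols] = M
    return padded

# A 3x5 matrix is embedded in an 8x8 matrix
print(pad_to_power_of_two(np.ones((3, 5))).shape)  # (8, 8)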

We partition A, B and C into equally sized block matrices

\[
A = \begin{pmatrix} A_{1,1} & A_{1,2} \\ A_{2,1} & A_{2,2} \end{pmatrix}, \quad
B = \begin{pmatrix} B_{1,1} & B_{1,2} \\ B_{2,1} & B_{2,2} \end{pmatrix}, \quad
C = \begin{pmatrix} C_{1,1} & C_{1,2} \\ C_{2,1} & C_{2,2} \end{pmatrix}
\]

with

\[ A_{i,j}, B_{i,j}, C_{i,j} \in F^{2^{n-1} \times 2^{n-1}}, \]

then

\[
\begin{aligned}
C_{1,1} &= A_{1,1} B_{1,1} + A_{1,2} B_{2,1} \\
C_{1,2} &= A_{1,1} B_{1,2} + A_{1,2} B_{2,2} \\
C_{2,1} &= A_{2,1} B_{1,1} + A_{2,2} B_{2,1} \\
C_{2,2} &= A_{2,1} B_{1,2} + A_{2,2} B_{2,2}.
\end{aligned}
\]

With this construction we have not reduced the number of multiplications. We still need 8 multiplications to calculate the C_{i,j} matrices, the same number of multiplications we need when using standard matrix multiplication.

Now comes the important part. We define new matrices

\[
\begin{aligned}
M_1 &:= (A_{1,1} + A_{2,2}) (B_{1,1} + B_{2,2}) \\
M_2 &:= (A_{2,1} + A_{2,2}) B_{1,1} \\
M_3 &:= A_{1,1} (B_{1,2} - B_{2,2}) \\
M_4 &:= A_{2,2} (B_{2,1} - B_{1,1}) \\
M_5 &:= (A_{1,1} + A_{1,2}) B_{2,2} \\
M_6 &:= (A_{2,1} - A_{1,1}) (B_{1,1} + B_{1,2}) \\
M_7 &:= (A_{1,2} - A_{2,2}) (B_{2,1} + B_{2,2})
\end{aligned}
\]

which are then used to express the C_{i,j} in terms of the M_k. Because of our definition of the M_k we can eliminate one matrix multiplication and reduce the number of multiplications to 7 (one multiplication for each M_k) and express the C_{i,j} as

\[
\begin{aligned}
C_{1,1} &= M_1 + M_4 - M_5 + M_7 \\
C_{1,2} &= M_3 + M_5 \\
C_{2,1} &= M_2 + M_4 \\
C_{2,2} &= M_1 - M_2 + M_3 + M_6.
\end{aligned}
\]

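As a check, substituting the definitions of M_3 and M_5 into the expression for C_{1,2} recovers the standard block formula:

\[ C_{1,2} = M_3 + M_5 = A_{1,1}(B_{1,2} - B_{2,2}) + (A_{1,1} + A_{1,2}) B_{2,2} = A_{1,1} B_{1,2} + A_{1,2} B_{2,2}. \]

The other three entries can be verified in the same way.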
We iterate this division process n times until the submatrices degenerate into single numbers (elements of the field F).

Practical implementations of Strassen's algorithm switch to standard methods of matrix multiplication for small enough submatrices, for which those methods are more efficient. A popular misconception is that Strassen's algorithm outperforms the classic multiplication approach only for very large matrices; on modern architectures this is not the case, owing to the Strassen algorithm's superior cache behavior. For example, on a 2 GHz Core Duo processor, Strassen's algorithm is three times faster at multiplying 128×128 matrices, and its running time is roughly the same as that of block multiplication for 256×256 and 512×512 matrices. For 1024×1024 and larger matrices (when the entire problem no longer fits into the processor's L2 cache), Strassen's algorithm is three times faster than block multiplication and twelve times as fast as the classical three-nested-loop implementation.
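A minimal recursive sketch of this cutoff strategy in Python (using NumPy, which the article does not reference; the function name strassen and the cutoff of 64 are illustrative assumptions, and real implementations tune the cutoff to the machine):

import numpy as np

def strassen(A, B, cutoff=64):
    """Multiply two 2^n x 2^n matrices via Strassen's seven-product recursion,
    falling back to ordinary multiplication once blocks are at most cutoff x cutoff."""
    n = A.shape[0]
    if n <= cutoff:
        return A @ B  # ordinary matrix multiplication for small blocks

    k = n // 2
    A11, A12, A21, A22 = A[:k, :k], A[:k, k:], A[k:, :k], A[k:, k:]
    B11, B12, B21, B22 = B[:k, :k], B[:k, k:], B[k:, :k], B[k:, k:]

    # The seven products M1..M7 defined above
    M1 = strassen(A11 + A22, B11 + B22, cutoff)
    M2 = strassen(A21 + A22, B11, cutoff)
    M3 = strassen(A11, B12 - B22, cutoff)
    M4 = strassen(A22, B21 - B11, cutoff)
    M5 = strassen(A11 + A12, B22, cutoff)
    M6 = strassen(A21 - A11, B11 + B12, cutoff)
    M7 = strassen(A12 - A22, B21 + B22, cutoff)

    # Reassemble the four blocks of C
    return np.block([[M1 + M4 - M5 + M7, M3 + M5],
                     [M2 + M4,           M1 - M2 + M3 + M6]])

# Check a 256x256 product against NumPy's own multiplication
A = np.random.rand(256, 256)
B = np.random.rand(256, 256)
assert np.allclose(strassen(A, B), A @ B)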

Numerical analysis

The standard matrix multiplication takes

\[ \left(2^n\right)^3 = 8^n \]

multiplications of the elements in the field F. We ignore the additions needed because, depending on F, they can be much faster than the multiplications in computer implementations, especially if the sizes of the matrix entries exceed the word size of the machine.

With the Strassen algorithm we can reduce the number of multiplications to

\[ 7^n = 7^{\log_2 2^n} = \left(2^n\right)^{\log_2 7} \approx \left(2^n\right)^{2.807}. \]
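For example, for 1024 × 1024 matrices (n = 10) this means about

\[ 7^{10} = 282\,475\,249 \approx 2.8 \times 10^{8} \]

element multiplications instead of 8^10 = 2^30 ≈ 1.07 × 10^9, a reduction by a factor of roughly 3.8.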

The reduction in the number of multiplications, however, comes at the price of somewhat reduced numerical stability.
