Using an approach similar to Strassen's, only 5 multiplications are sufficient to compute the square of a 2x2 matrix. If A = [[a, b], [c, d]], the multiplications are a * a, d * d, b * (a + d), c * (a + d), and b * c.
If we generalise this algorithm recursively to square an n-by-n matrix, the complexity would apparently drop to O(n^(log2 5)).
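For concreteness, here is a minimal sketch of the 5-multiplication scheme for a 2x2 matrix with *scalar* entries (the function name and test values are my own illustration, not part of the original claim):

```python
def square2x2(a, b, c, d):
    """Square the 2x2 matrix [[a, b], [c, d]] using 5 multiplications."""
    m1 = a * a          # a^2
    m2 = d * d          # d^2
    m3 = b * (a + d)    # top-right entry
    m4 = c * (a + d)    # bottom-left entry
    m5 = b * c          # reused in both diagonal entries (needs b*c == c*b)
    return [[m1 + m5, m3],
            [m4, m5 + m2]]

# Quick check against the direct definition A^2 = A * A for scalars.
a, b, c, d = 2, 3, 5, 7
expected = [[a*a + b*c, a*b + b*d],
            [c*a + d*c, c*b + d*d]]
assert square2x2(a, b, c, d) == expected
```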
I was asked to find what is wrong with this algorithm, and where it fails when it is generalised to compute the square of an n-by-n matrix.
You can get by with only 5 multiplications at the root of the call tree, but once a, b, c, d are matrix blocks rather than scalars, b * (a + d), c * (a + d) and b * c are general products of distinct blocks, not squares, so the recursion cannot call the squaring routine on them. (Worse, b * c need not equal c * b for blocks, so the single product b * c no longer serves both diagonal entries.) The running time is therefore no better than Strassen's algorithm for matrix multiplication.
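A quick illustration (mine, not from the answer) of why the scalar trick breaks for blocks:

```python
import numpy as np

# For 2x2 blocks, b @ c and c @ b generally differ, so one product b*c
# cannot fill both diagonal entries of the square, and b*(a + d) is a
# general product rather than a square.
b = np.array([[1, 2],
              [3, 4]])
c = np.array([[0, 1],
              [1, 0]])
print(b @ c)                         # [[2, 1], [4, 3]]
print(c @ b)                         # [[3, 4], [1, 2]]
print(np.array_equal(b @ c, c @ b))  # False
```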
Put another way, if we had an O(n^c) algorithm for squaring an n-by-n matrix, then we'd get an O(n^c) algorithm for multiplying two arbitrary n-by-n matrices, by squaring the 2n-by-2n block matrix
    [0 A]^2   [AB  0 ]
    [B 0]   = [ 0  BA].
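A small sketch of that reduction with NumPy (the function name multiply_via_squaring and the default squaring routine are my own assumptions for illustration):

```python
import numpy as np

def multiply_via_squaring(A, B, square=lambda M: M @ M):
    """Compute A @ B using a single call to a matrix-squaring routine."""
    n = A.shape[0]
    Z = np.zeros((n, n), dtype=A.dtype)
    C = np.block([[Z, A],
                  [B, Z]])   # [[0, A], [B, 0]]
    S = square(C)            # [[A@B, 0], [0, B@A]]
    return S[:n, :n]         # top-left block is A @ B

A = np.arange(4).reshape(2, 2)
B = np.arange(4, 8).reshape(2, 2)
assert np.array_equal(multiply_via_squaring(A, B), A @ B)
```

So any fast squaring algorithm would immediately give an equally fast multiplication algorithm, which is why the claimed O(n^(log2 5)) bound cannot come for free.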