I know that the lower bound for the multiplication of two full matrices is Ω(n^2).
I have been trying to prove a lower bound for the multiplication of two lower triangular matrices using the problem-transformation (reduction) method.
My initial idea is to (1) transform the lower triangular matrices into full matrices, and (2) estimate the time complexity of that transformation.
T(lower_triangular_matrix_multiplication(n)) + O(lower_triangular_matrix_transformation(n)) ≥ T(full_matrix_multiplication(n)) = Ω(n^2)
Now, I only have to bound O(lower_triangular_matrix_transformation(n)). I need to turn the triangular matrix into a full matrix, so I simply multiply the triangular matrix by a variant of itself, say its transpose, for simplicity.
The reason is that the square of a lower triangular matrix is still a lower triangular matrix, whereas a lower triangular matrix multiplied by its transpose is, in general, a "full" matrix. So I only need to analyze the complexity of multiplying a triangular matrix by its transpose.
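For concreteness, here is a small numpy sketch (my own illustration, not part of the proof) of that structural claim:

```python
import numpy as np

# The square of a lower triangular L stays lower triangular,
# while L @ L.T is in general a "full" matrix.
n = 4
rng = np.random.default_rng(0)
L = np.tril(rng.integers(1, 10, size=(n, n)))  # random lower triangular, positive entries

square = L @ L    # still lower triangular
full = L @ L.T    # no triangular structure left

print(np.array_equal(square, np.tril(square)))   # True: everything above the diagonal is 0
print(np.count_nonzero(np.triu(full, k=1)) > 0)  # True: nonzero entries above the diagonal
```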
Could anyone indicate if my thinking is "reasonable"?
Determining the exact complexity of matrix multiplication is a major open question in theoretical computer science. As of October 2022, the best announced bound on the asymptotic complexity of a matrix multiplication algorithm is O(n^2.37188), given by Duan, Wu and Zhou in a preprint.
A square matrix is called lower triangular if all the entries above the main diagonal are zero; similarly, it is called upper triangular if all the entries below the main diagonal are zero. The product of two lower (respectively upper) triangular matrices is again lower (respectively upper) triangular; consequently, the product of any number of lower triangular matrices is lower triangular, and if all the factor matrices are unit diagonal, the product is unit diagonal as well. A triangular matrix is invertible if and only if its diagonal entries are non-zero.
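These facts are easy to sanity-check numerically; a minimal numpy sketch (illustration only):

```python
import numpy as np

# Check the closure and invertibility facts on random lower triangular matrices.
rng = np.random.default_rng(1)
A = np.tril(rng.standard_normal((5, 5)))
B = np.tril(rng.standard_normal((5, 5)))

P = A @ B
print(np.allclose(P, np.tril(P)))                        # product is lower triangular
print(np.allclose(np.diag(P), np.diag(A) * np.diag(B)))  # its diagonal is the entrywise product
print(np.all(np.diag(A) != 0))                           # diagonal nonzero (almost surely), so...
print(abs(np.linalg.det(A)) > 0)                         # ...A is invertible: det = product of the diagonal
```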
I'm not convinced that constructing an algorithm is a sufficient proof of a lower bound, since it cannot be excluded that a different algorithm with lower complexity exists.
It is clear that the lower bound cannot be higher than the complexity of general matrix multiplication, since a general multiplication algorithm is always applicable to lower triangular matrices (ltm). Now, since any transformation of an arbitrary matrix into one or more ltm is itself an operation of O(n^2) complexity, such a reduction cannot rule out the existence of a faster algorithm.
Your outline of reasoning seems to assume that adding complexities follows "normal" arithmetic rules. This is not the case! The resulting complexity is always (at least) the maximum of the complexities of the algorithm's parts; in particular, you cannot subtract one asymptotic bound from another. So your reasoning does not seem to be sound.
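A minimal counterexample to the subtraction idea (my own illustration, with hypothetical costs): suppose the triangular multiplication itself cost only f(n) = n while the transformation costs g(n) = n^2. Then

$$ f(n) + g(n) = n + n^2 = \Omega(n^2) \quad\text{even though}\quad f(n) = o(n^2), $$

so the combined bound is carried entirely by the transformation and says nothing non-trivial about f(n).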
You could examine the following: to me it looks as if the step from T(lower_triangular_matrix_multiplication(n)) + O(lower_triangular_matrix_transformation(n)) ≥ Ω(n^2) is not well grounded, and from it the conclusion that you just have to bound O(lower_triangular_matrix_transformation(n)) is broken.
I was thinking that the solution might be to transform A into A + A'. The complexity of both the transpose and the addition is O(n^2), so O(lower_triangular_matrix_transformation(n)) = O(n^2). Because the lower bounds of AA and of (A + A')(A + A') are both Ω(n^2), T(lower_triangular_matrix_multiplication(n)) ≥ T(full_matrix_multiplication(n)) - O(lower_triangular_matrix_transformation(n)), which would mean that the lower bound for triangular matrix multiplication and the one for full matrix multiplication are identical.
Q.E.D.
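For what it's worth, a sketch of just the transformation step, under my reading that A' means the transpose of A:

```python
import numpy as np

# The proposed transformation: turn a lower triangular A into the
# symmetric (hence "full") matrix A + A'.  Transpose and addition
# each touch all n^2 entries, so the transformation is O(n^2).
n = 4
rng = np.random.default_rng(2)
A = np.tril(rng.integers(1, 10, size=(n, n)))

B = A + A.T
print(np.count_nonzero(np.triu(B, k=1)) > 0)  # True: B has entries above the diagonal
print(np.array_equal(B, B.T))                 # True: B is symmetric
```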