
How to detect if all the rows of a non-square matrix are orthogonal in Python

I can test the rank of a matrix using np.linalg.matrix_rank(A). But how can I efficiently test whether all the rows of A are orthogonal?

I could take all pairs of rows and compute the inner product between them, but is there a better way?

My matrix has fewer rows than columns and the rows are not unit vectors.

graffe, asked May 06 '14 19:05
3 Answers

This answer basically summarizes the approaches mentioned in the question and the comments, and adds some comparisons and insights about them.


Approach #1 -- checking all row-pairs

As you suggested, you can iterate over all row pairs and compute the inner product of each pair. If A.shape == (N, M), i.e. you have N rows of size M each, you end up with O(M*N^2) complexity.
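A minimal sketch of this pairwise check (the function name and tolerance are my own; with floating-point data you should compare against a small tolerance rather than exact zero):

```python
import numpy as np

def rows_orthogonal_pairwise(A, tol=1e-10):
    """Check every pair of rows; short-circuits on the first failure."""
    n = A.shape[0]
    for i in range(n):
        for j in range(i + 1, n):
            # inner product of rows i and j should be (numerically) zero
            if abs(np.dot(A[i], A[j])) > tol:
                return False
    return True
```

Note the early return: as soon as one non-orthogonal pair is found, no further work is done.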

Approach #2 -- matrix multiplication

As suggested in the comments by @JoeKington, you can compute the product A.dot(A.T) and check all the off-diagonal elements. Depending on the algorithm used for matrix multiplication, this can be asymptotically better than the naive O(M*N^2) approach, but unless your matrices are big, it is likely to be slower in practice.
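A sketch of this Gram-matrix check (function name and tolerance are my own choices; the off-diagonal entries of A @ A.T are exactly the pairwise inner products of the rows):

```python
import numpy as np

def rows_orthogonal_gram(A, tol=1e-10):
    """Rows of A are orthogonal iff A @ A.T is diagonal (up to tol)."""
    G = A.dot(A.T)          # N x N Gram matrix of the rows
    np.fill_diagonal(G, 0)  # ignore the squared row norms on the diagonal
    return bool(np.all(np.abs(G) <= tol))
```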


The advantages of approach #1:

  • You can "short circuit" -- quit the check as soon as you find the first non-orthogonal pair
  • It requires less memory. In #2, you create a temporary NxN matrix.

The advantages of approach #2:

  • The multiplication is fast, as it is implemented in a heavily-optimized linear-algebra library (BLAS or ATLAS). Those libraries choose the right algorithm for the input size -- they won't use the fancy asymptotically-faster algorithms on small matrices, where the large constant hidden behind the O-notation makes them slower.
  • There is less code to write.

My bet is that for small matrices, approach #2 would prove faster because the linear-algebra libraries are so heavily optimized, even though it computes the entire product and cannot stop after finding the first non-orthogonal pair.

shx2, answered Oct 11 '22 04:10

It seems that this will do:

product = np.dot(A, A.T)
np.fill_diagonal(product, 0)
if not product.any():
    # all off-diagonal inner products are zero, so the rows are orthogonal
    ...

(With floating-point data, an exact zero test is fragile; np.allclose(product, 0) is safer.)
graffe, answered Oct 11 '22 04:10


Approach #3 -- computing the QR decomposition of A^T

In general, to find an orthogonal basis of the range space of some matrix X, one can compute the QR decomposition of this matrix (using Givens rotations or Householder reflectors). Q is an orthogonal matrix and R is upper triangular. The columns of Q corresponding to non-zero diagonal entries of R form an orthonormal basis of the range space.

If the columns of X = A^T, i.e. the rows of A, are already orthogonal, then the QR decomposition will necessarily have a diagonal R factor, where the diagonal entries are plus or minus the lengths of the columns of X, i.e. of the rows of A.

Common folklore has it that this approach is numerically better behaved than computing the product A*A^T = R^T*R. This may only matter for larger matrices. The computation is not as straightforward as the matrix product; however, the operation count is of the same order.
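A sketch of this QR-based check using np.linalg.qr (function name and tolerance are my own; since A has fewer rows than columns, the reduced QR of A^T gives a small N x N factor R, and the rows of A are orthogonal exactly when R is diagonal up to rounding):

```python
import numpy as np

def rows_orthogonal_qr(A, tol=1e-10):
    """Rows of A are orthogonal iff R in the QR decomposition of A.T
    is diagonal (up to tol)."""
    _, R = np.linalg.qr(A.T)            # reduced QR: R is N x N
    off_diag = R - np.diag(np.diag(R))  # strip the diagonal
    return bool(np.all(np.abs(off_diag) <= tol))
```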

Lutz Lehmann, answered Oct 11 '22 03:10