I have a nearly diagonal matrix: the off-diagonal elements are much smaller than the diagonal ones, so upon diagonalization I expect the diagonal elements to change only slightly. I am currently using numpy.linalg.eig
to get the eigenvalues, but I would like their order to match the (similar) values before diagonalization. So, if the diagonal of the original matrix was (100, 200, 300), after diagonalization I would like the output (101, 202, 303) and not (202, 303, 101). I need this in order to keep track of which value got mapped to which after diagonalization. How can I do this? There seems to be no clear order to the output of numpy.linalg.eig
? Thank you!
I don't think there can be a rigorous solution to this problem, surely not using straight numpy.
From an admittedly very cursory examination, I believe numpy hands the work off to LAPACK; conceptually, the eigenvalues are the roots of the characteristic polynomial det(A - L*I). That is fast and accurate, but it cannot preserve eigenvalue order.
For simplicity, consider a two by two matrix:
[[ 10  0.1 ]
 [ 0.1 20  ]]
This has characteristic polynomial L^2 - 30L + 199.99 (trace 30, determinant 200 - 0.01 = 199.99).
But consider this other matrix, with the same values in the opposite order, obtained by swapping the previous diagonal...
[[ 20  0.1 ]
 [ 0.1 10  ]]
...and you end up with the same trace and the same determinant, hence the same characteristic polynomial. So, from the polynomial alone, you cannot tell which matrix you started from, and you cannot determine the original order of the diagonal elements. From here onwards, that information no longer exists. You do get the eigenvalues, but not their original order.
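A quick numerical check of this claim (using numpy.poly, which returns the characteristic polynomial coefficients of a square matrix):

```python
import numpy as np

# Both orderings of the diagonal give the same characteristic polynomial,
# so its roots carry no information about the original order.
a = np.array([[10.0, 0.1], [0.1, 20.0]])
b = np.array([[20.0, 0.1], [0.1, 10.0]])
print(np.poly(a))  # coefficients [1, -30, 199.99]
print(np.poly(b))  # identical (up to rounding)
```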
When you build the characteristic polynomial, just as when you multiply x*y, the information about the order of the inputs gets lost (in the case of multiplication, it gets completely lost, always).
So the eigenvalues appear "shuffled" with respect to the original diagonal.
But what actually happens is not that the original values get "slightly shifted": they are, each time, completely lost, and other similar values are computed in their place. The fact that they "look like" the previous values is a red herring. So there is no guarantee that a match will be possible.
It's sort of the mathematical version of "vanishing illusion" puzzles like Teddy and the Lions.
What you can try to do is match the values "a posteriori", i.e. select the permutation of the eigenvalues that most closely approximates the original diagonal.
There are algorithms to do that, using different metrics to determine which permutation is "closer" (usually, least squares).
A quick way: sort the initial diagonal values in ascending order and record the permutation used (this is O(n log n)); then sort the eigenvalues in ascending order and apply the inverse permutation. This gets you a "close match".
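A minimal sketch of that sorting trick (the name eig_ordered is mine, not a numpy API):

```python
import numpy as np

def eig_ordered(a):
    """Reorder eigenvalues/eigenvectors to line up with the original
    diagonal: sort the diagonal, sort the eigenvalues, then undo the
    diagonal's sort permutation. A close-match heuristic, not exact."""
    lam, v = np.linalg.eig(a)
    perm = np.argsort(np.diag(a))     # permutation that sorts the diagonal
    order = np.empty_like(perm)
    order[perm] = np.argsort(lam)     # place sorted eigenvalues accordingly
    return lam[order], v[:, order]

a = np.array([[200.0, 0.1, 0.1],
              [0.1, 100.0, 0.1],
              [0.1, 0.1, 300.0]])
lam, v = eig_ordered(a)
print(lam)  # close to [200, 100, 300], matching the original diagonal order
```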
However, it might still be possible to do something akin to what you describe using a different algorithm, e.g. Jacobi's method or another iterative method.
Note that such algorithms preserve the order only locally: the value in a given position stays in that position as it converges. But it can still change enough that you get values in an order different from the one you expect. E.g. you start with diagonal [2.1, 1.9, 12], the first element shrinks to 1.999, the second grows to 2.001, and you might be led to believe that "the two elements got swapped".
As I said, my linear algebra days are long past, but I remembered that the Jacobi method is iterative and so might fit your bill. I searched for it and stopped at the first result, which looks correct to me. If it isn't, a more focused search should still yield a working Python implementation; the theory is proven.
GitHub page
from numpy import array
# jacobi() is from the implementation linked above
a = array([[2.0, 0.1, 0.1],
           [0.1, 7.0, 0.1],
           [0.1, 0.1, 11.0]])
lam, x = jacobi(a, tol=1.0e-9)
print(lam)
# Expected: close to [2, 7, 11], in this order
# [ 1.99693421  6.99940092 11.00366487]
Runs with several other random matrices yielded consistent results.
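For reference, here is a minimal cyclic-Jacobi sketch of my own (for symmetric matrices only; this is an illustration of the technique, not the linked code):

```python
import numpy as np

def jacobi_eig(a, tol=1.0e-9, max_sweeps=50):
    """Cyclic Jacobi iteration for a symmetric matrix.

    Each rotation zeroes one off-diagonal pair while only nudging the two
    affected diagonal entries, so each converged eigenvalue ends up in the
    position where its diagonal entry started."""
    a = np.array(a, dtype=float)
    n = a.shape[0]
    for _ in range(max_sweeps):
        if np.sqrt((np.tril(a, -1) ** 2).sum()) < tol:
            break  # off-diagonal part is negligible: a is (numerically) diagonal
        for p in range(n - 1):
            for q in range(p + 1, n):
                # rotation angle that zeroes a[p, q]
                theta = 0.5 * np.arctan2(2.0 * a[p, q], a[q, q] - a[p, p])
                c, s = np.cos(theta), np.sin(theta)
                r = np.eye(n)
                r[p, p] = r[q, q] = c
                r[p, q], r[q, p] = s, -s
                a = r.T @ a @ r
    return np.diag(a).copy()

m = [[2.0, 0.1, 0.1],
     [0.1, 7.0, 0.1],
     [0.1, 0.1, 11.0]]
print(jacobi_eig(m))  # close to [2, 7, 11], in the original order
```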