I'm trying to write a program that takes a matrix A of any size and computes its SVD:
A = U * S * V'
where A is the matrix the user enters, U is an orthogonal matrix composed of the eigenvectors of A * A', S is a diagonal matrix of the singular values, and V is an orthogonal matrix composed of the eigenvectors of A' * A.
The problem is that the MATLAB function eig sometimes returns the wrong eigenvectors.
This is my code:
function [U,S,V]=badsvd(A)
    W=A*A';
    [U,S]=eig(W);
    max=0;
    for i=1:size(W,1) %%sort
        for j=i:size(W,1)
            if(S(j,j)>max)
                max=S(j,j);
                temp_index=j;
            end
        end
        max=0;
        temp=S(temp_index,temp_index);
        S(temp_index,temp_index)=S(i,i);
        S(i,i)=temp;
        temp=U(:,temp_index);
        U(:,temp_index)=U(:,i);
        U(:,i)=temp;
    end
    W=A'*A;
    [V,s]=eig(W);
    max=0;
    for i=1:size(W,1) %%sort
        for j=i:size(W,1)
            if(s(j,j)>max)
                max=s(j,j);
                temp_index=j;
            end
        end
        max=0;
        temp=s(temp_index,temp_index);
        s(temp_index,temp_index)=s(i,i);
        s(i,i)=temp;
        temp=V(:,temp_index);
        V(:,temp_index)=V(:,i);
        V(:,i)=temp;
    end
    s=sqrt(s);
end
My code returns the correct s matrix, and also "nearly" correct U and V matrices, except that some of their columns are multiplied by -1. Obviously, if t is an eigenvector, then -t is also an eigenvector, but with those signs inverted (for some of the columns, not all) I don't get A = U * S * V'.
Is there any way to fix this?
Example: for the matrix A=[1,2;3,4]
my function returns:
U=[0.4046,-0.9145;0.9145,0.4046]
and the built-in MATLAB svd
function returns:
u=[-0.4046,-0.9145;-0.9145,0.4046]
Note that eigenvectors are not unique. Multiplying one by any nonzero constant, including -1 (which simply flips the sign), gives another valid eigenvector. This is clear from the definition of an eigenvector:
A·v = λ·v
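As a quick illustration (a small sketch with an arbitrary example matrix; the names M, Vm, Dm are just for this demo), an eigenvector and its negation both satisfy that equation:
M = [2 1; 1 2];                  % small symmetric example matrix
[Vm, Dm] = eig(M);
v = Vm(:,1);                     % one eigenvector of M
lambda = Dm(1,1);                % its eigenvalue
norm(M*v - lambda*v)             % ~0: v satisfies M*v = lambda*v
norm(M*(-v) - lambda*(-v))       % ~0 as well: -v is just as valid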
MATLAB chooses to normalize the eigenvectors to have a norm of 1.0; the sign is arbitrary:
For eig(A), the eigenvectors are scaled so that the norm of each is 1.0. For eig(A,B), eig(A,'nobalance'), and eig(A,B,flag), the eigenvectors are not normalized.
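To see this numerically (a small sketch; B, VB, and VS are names chosen just for this demo):
B = rand(4);                     % arbitrary example matrix
[VB, DB] = eig(B'*B);
sqrt(sum(VB.^2, 1))              % each column of VB has norm 1.0
[~, ~, VS] = svd(B);
% columns of VB and VS may differ by a factor of -1 (and by ordering)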
Now, as you know, SVD and eigendecomposition are related. Below is some code to test this fact. Note that svd and eig return their results in different orders (svd sorts the singular values from largest to smallest, while eig does not, which is why the eigenvalues are sorted explicitly below):
% some random matrix
A = rand(5);
% singular value decomposition
[U,S,V] = svd(A);
% eigenvectors of A'*A are the same as the right-singular vectors
[V2,D2] = eig(A'*A);
[D2,ord] = sort(diag(D2), 'descend');
S2 = diag(sqrt(D2));
V2 = V2(:,ord);
% eigenvectors of A*A' are the same as the left-singular vectors
[U2,D2] = eig(A*A');
[D2,ord] = sort(diag(D2), 'descend');
S3 = diag(sqrt(D2));
U2 = U2(:,ord);
% check results
A
U*S*V'
U2*S2*V2'
I get very similar results (ignoring minor floating-point errors):
>> norm(A - U*S*V')
ans =
7.5771e-16
>> norm(A - U2*S2*V2')
ans =
3.2841e-14
To get consistent results, one usually adopts the convention of requiring that the first element of each eigenvector have a certain sign (say, positive). That way, if you get an eigenvector that does not follow this rule, you multiply it by -1 to flip the sign.
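For example, here is one way to apply that convention to the eig-based factors from the code above (a sketch, assuming A is square with nonzero singular values, as in the rand(5) example): flip each right singular vector so its first element is positive, then derive the matching left singular vector as A*v_i/s_i so every u/v pair stays sign-consistent and the product still reconstructs A.
sgn = sign(V2(1,:));                % sign of the first element of each right singular vector
sgn(sgn == 0) = 1;                  % guard against an exact zero
V2 = V2 * diag(sgn);                % make each first element positive
U2 = A * V2 * diag(1 ./ diag(S2));  % u_i = A*v_i / s_i, consistent with v_i
norm(A - U2*S2*V2')                 % should again be ~0
Deriving U2 from V2 (rather than flipping the two sets of eigenvectors independently) is what keeps U2*S2*V2' equal to A.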