 

What is SVD (singular value decomposition)?

People also ask

What is singular value decomposition (SVD)? Explain with an example.

The Singular Value Decomposition (SVD) of a matrix is a factorization of that matrix into three matrices. It has some interesting algebraic properties and conveys important geometrical and theoretical insights about linear transformations. It also has some important applications in data science.

What is a singular value in SVD?

The singular values are the diagonal entries of the S matrix and are arranged in descending order. The singular values are always real numbers. If the matrix A is a real matrix, then U and V are also real.
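For concreteness, here is a minimal numpy sketch of both points above; the matrix values are arbitrary, chosen only for illustration:

    import numpy as np

    # An arbitrary 3x2 real matrix, chosen only for illustration.
    A = np.array([[3.0, 1.0],
                  [1.0, 3.0],
                  [0.0, 2.0]])

    # Compact SVD: A = U @ diag(s) @ Vt.
    U, s, Vt = np.linalg.svd(A, full_matrices=False)

    print(s)                                     # real, sorted in descending order
    print(np.allclose(A, U @ np.diag(s) @ Vt))   # True: the factors reconstruct A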

Why is SVD called singular value decomposition?

SVD stands for Singular Value Decomposition. Decomposing a data matrix X using SVD results in three matrices: two matrices of singular vectors, U and V, and one singular value matrix whose diagonal elements are the singular values.

What is singular value decomposition used for?

Singular Value Decomposition (SVD) is a widely used technique to decompose a matrix into several component matrices, exposing many of the useful and interesting properties of the original matrix.


For square matrices, SVD can be understood in a geometric sense as a transformation acting on a vector.

Consider a square n x n matrix M multiplying a vector v to produce an output vector w:

w = M*v

The singular value decomposition of M is the product of three matrices, M = U*S*V^T, so w = U*S*V^T*v. U and V are orthonormal matrices. From a geometric transformation point of view (acting on a vector by multiplying it), they are combinations of rotations and reflections that do not change the length of the vector they multiply. S is a diagonal matrix that represents scaling or squashing with a different scaling factor (a diagonal term) along each of the n axes.

So the effect of left-multiplying a vector v by the matrix M is to rotate/reflect v by M's orthonormal factor V^T, then scale/squash the result by the diagonal factor S, then rotate/reflect that result by M's orthonormal factor U.
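A minimal sketch of this three-stage pipeline in numpy, assuming the M = U*S*V^T convention that np.linalg.svd returns (the random matrix is just for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    M = rng.standard_normal((3, 3))    # a square n x n matrix, as above
    v = rng.standard_normal(3)

    U, s, Vt = np.linalg.svd(M)        # M = U @ diag(s) @ Vt

    step1 = Vt @ v                     # rotate/reflect by the orthonormal V^T
    step2 = s * step1                  # scale/squash each axis by a singular value
    step3 = U @ step2                  # rotate/reflect by the orthonormal U

    print(np.allclose(M @ v, step3))   # True: the three stages reproduce w = M*v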

One reason SVD is desirable from a numerical standpoint is that multiplication by orthonormal matrices is an invertible and extremely stable operation (their condition number is 1). SVD captures any ill-conditioning of M in the diagonal scaling matrix S.
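To see this numerically (a sketch; np.linalg.cond uses the 2-norm by default, under which the condition number is exactly the ratio of the largest to the smallest singular value):

    import numpy as np

    rng = np.random.default_rng(1)
    M = rng.standard_normal((4, 4))
    U, s, Vt = np.linalg.svd(M)

    print(np.linalg.cond(U))    # 1.0 (up to rounding): orthonormal factors are perfectly conditioned
    print(np.linalg.cond(Vt))   # 1.0 (up to rounding)
    print(np.linalg.cond(M))    # all of M's ill-conditioning ...
    print(s.max() / s.min())    # ... shows up as the spread of the singular values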


One way to use SVD to reduce noise is to compute the decomposition, set the singular values that are near zero to exactly zero, then re-compose the matrix.
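A sketch of that recipe in numpy; the threshold tol is a hypothetical parameter that in practice would be chosen from the known noise level of the data:

    import numpy as np

    def denoise(M, tol):
        """Zero the near-zero singular values of M and re-compose."""
        U, s, Vt = np.linalg.svd(M, full_matrices=False)
        s[s < tol] = 0.0               # set components near zero to exactly zero
        return U @ np.diag(s) @ Vt

    # Example: a rank-1 matrix plus tiny noise is cleaned back to rank 1.
    rng = np.random.default_rng(2)
    u, v = rng.standard_normal(5), rng.standard_normal(4)
    noisy = np.outer(u, v) + 1e-8 * rng.standard_normal((5, 4))
    print(np.linalg.matrix_rank(denoise(noisy, tol=1e-4)))   # 1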

Here's an online tutorial on SVD.

You might want to take a look at Numerical Recipes.


Singular value decomposition is a method for taking an n×m matrix M and "decomposing" it into three matrices such that M = USV^T. S is a diagonal matrix (its only nonzero entries are on the diagonal from top-left to bottom-right) containing the "singular values" of M. U and V are orthogonal, which leads to the geometric understanding of SVD, but that isn't necessary for noise reduction.

With M = USV^T, we still have the original matrix M with all its noise intact. However, if we keep only the k largest singular values (which is easy, since many SVD algorithms compute a decomposition with the entries of S sorted in nonincreasing order), then we have an approximation of the original matrix. This works because we assume the small values are the noise, and the more significant patterns in the data are expressed through the vectors associated with the larger singular values.

In fact, the resulting approximation is the most accurate rank-k approximation of the original matrix, in the sense that it has the least squared error (this is the Eckart–Young theorem).
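A sketch of the truncation in numpy (rank_k_approx is a hypothetical helper name; it keeps only the k largest singular values):

    import numpy as np

    def rank_k_approx(M, k):
        """Best rank-k least-squares approximation of M (Eckart-Young)."""
        U, s, Vt = np.linalg.svd(M, full_matrices=False)   # s is nonincreasing
        return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    rng = np.random.default_rng(3)
    M = rng.standard_normal((6, 4))
    M2 = rank_k_approx(M, 2)
    print(np.linalg.matrix_rank(M2))   # 2
    print(np.linalg.norm(M - M2))      # no rank-2 matrix gets closer in this norm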


To answer the title question: SVD is a generalization of eigenvalues/eigenvectors to non-square matrices. Say $X \in \mathbb{R}^{N \times p}$; then the SVD of X yields $X = U D V^T$, where D is diagonal and U and V are orthogonal matrices. Now $X^T X$ is a square matrix, and its decomposition is $X^T X = V D^2 V^T$, where the columns of V are the eigenvectors of $X^T X$ and the diagonal of $D^2$ contains its eigenvalues.
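A quick numerical check of this relationship (a sketch; np.linalg.eigh returns eigenvalues in ascending order, so they are reversed before comparing against the descending singular values):

    import numpy as np

    rng = np.random.default_rng(4)
    X = rng.standard_normal((10, 4))                   # N x p with N > p

    U, d, Vt = np.linalg.svd(X, full_matrices=False)   # X = U D V^T
    eigvals, _ = np.linalg.eigh(X.T @ X)               # eigendecomposition of X^T X

    print(np.allclose(d**2, eigvals[::-1]))            # True: eigenvalues of X^T X
                                                       # are the squared singular values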