Difference between TruncatedSVD and svds

I see that the documentation for both sklearn.decomposition.TruncatedSVD and scipy.sparse.linalg.svds mentions that they perform SVD on sparse matrices. What is the difference between them?

Thanks.

asked Sep 09 '13 by abhinavkulkarni

2 Answers

TruncatedSVD is more feature-rich. It has the scikit-learn API, so you can put it in a sklearn.pipeline.Pipeline object and call transform on a new matrix instead of having to figure out the matrix multiplications yourself. It offers two algorithms: a fast randomized SVD solver (the default) or scipy.sparse.linalg.svds.
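For instance, here is a minimal sketch of the Pipeline usage (the TfidfVectorizer and the toy documents are arbitrary choices, just for illustration):

# put TruncatedSVD behind a vectorizer and project unseen documents
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

lsa = Pipeline([
    ("tfidf", TfidfVectorizer()),
    ("svd", TruncatedSVD(n_components=2, random_state=42)),
])

docs = ["the cat sat on the mat", "the dog sat on the log",
        "cats and dogs are pets", "mats and logs are things"]
X_train = lsa.fit_transform(docs)             # (4, 2) dense array of document scores
X_new = lsa.transform(["a cat and a dog"])    # project a new document into the same space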

(Full disclosure: I wrote TruncatedSVD.)
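Switching between the two solvers is just the algorithm parameter; a rough sketch (the random sparse matrix is only a stand-in):

import numpy as np
from scipy.sparse import random as sparse_random
from sklearn.decomposition import TruncatedSVD

X = sparse_random(100, 50, density=0.1, random_state=0)

# default: the fast randomized solver
svd_rand = TruncatedSVD(n_components=5, algorithm="randomized", random_state=0)
# 'arpack' delegates to scipy.sparse.linalg.svds
svd_arpack = TruncatedSVD(n_components=5, algorithm="arpack", random_state=0)

print(np.round(svd_rand.fit(X).singular_values_, 3))
print(np.round(svd_arpack.fit(X).singular_values_, 3))  # leading singular values should (approximately) agree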

answered Nov 16 '22 by Fred Foo

They are doing the same thing. Compare the three decompositions of the same small matrix:

# check how to use TruncatedSVD
import numpy as np

X = [[1, 2, 3], [1, 4, 2], [4, 1, 7], [5, 6, 8]]

# TRUNCATED SVD
from sklearn.decomposition import TruncatedSVD
svd = TruncatedSVD(n_components=2, n_iter=7, random_state=42)

US = svd.fit_transform(X)      # U * S (scores)
V = svd.components_            # V^T
S = svd.singular_values_
print('u,s,v', US, S, V)
print('X_restored dot way', np.round(np.dot(US, V), 1),
      'svdinverse way', np.round(svd.inverse_transform(US), 1))

# LINALG SVD
U1, S1, V1 = np.linalg.svd(X)

print('u1,s1,v1 remark negative mirrored', U1[:, :2] * S1[:2], V1[:2, :])
print('X restored u1,s1,v1, 2 components',
      np.round(np.dot(U1[:, :2] * S1[:2], V1[:2, :]), 1))

# SPARSE SVD
from scipy.sparse import csc_matrix
from scipy.sparse.linalg import svds
A = csc_matrix(X, dtype=float)

u2, s2, vt2 = svds(A, k=2)

print('sparse reverses !', u2 * s2, vt2)
print('x restored', np.round(np.dot(u2 * s2, vt2), 1))

RESULT

    u,s,v [[ 3.66997034 -0.34754761]
 [ 3.82764223 -2.51681397]
 [ 7.61154768  2.83860088]
 [11.13470337 -0.96070751]] [14.49264657  3.92883644] [[ 0.44571865  0.46215842  0.76664495]
 [ 0.23882889 -0.88677195  0.39572247]]
X_restored dot way
 [[1.6 2.  2.7]
 [1.1 4.  1.9]
 [4.1 1.  7. ]
 [4.7 6.  8.2]]
svdinverse way
 [[1.6 2.  2.7]
 [1.1 4.  1.9]
 [4.1 1.  7. ]
 [4.7 6.  8.2]]
u1,s1,v1 remark negative mirrored
 [[ -3.66997034   0.34754761]
 [ -3.82764223   2.51681397]
 [ -7.61154768  -2.83860088]
 [-11.13470337   0.96070751]] [[-0.44571865 -0.46215842 -0.76664495]
 [-0.23882889  0.88677195 -0.39572247]]
X restored u1,s1,v1, 2 components
 [[1.6 2.  2.7]
 [1.1 4.  1.9]
 [4.1 1.  7. ]
 [4.7 6.  8.2]]
sparse reverses !
 [[-0.34754761  3.66997034]
 [-2.51681397  3.82764223]
 [ 2.83860088  7.61154768]
 [-0.96070751 11.13470337]]
 [[ 0.23882889 -0.88677195  0.39572247]
 [ 0.44571865  0.46215842  0.76664495]]
x restored
 [[1.6 2.  2.7]
 [1.1 4.  1.9]
 [4.1 1.  7. ]
 [4.7 6.  8.2]]
[[-0.25322982  0.0884607   0.88223679]
 [-0.26410926  0.64060034  0.16752502]
 [-0.52520067 -0.72250421  0.11259767]
 [-0.76830021  0.24452723 -0.42534148]]
 [14.49264657  3.92883644  0.72625043]
 [[-0.44571865 -0.46215842 -0.76664495]
 [-0.23882889  0.88677195 -0.39572247]
 [-0.86272571 -0.00671608  0.50562757]]
 [[1. 2. 3.]
 [1. 4. 2.]
 [4. 1. 7.]
 [5. 6. 8.]]
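The "reversed" and "negative mirrored" columns above are just conventions: svds returns singular values in ascending order, and singular vectors are only determined up to a sign. A quick sketch to line the outputs up (reusing US, S, V and u2, s2, vt2 from the code above):

import numpy as np

order = np.argsort(s2)[::-1]                  # svds: ascending -> descending
u2o, s2o, vt2o = u2[:, order], s2[order], vt2[order, :]

print(np.allclose(s2o, S))                           # same singular values
print(np.allclose(np.abs(u2o * s2o), np.abs(US)))    # same scores, up to column signs
print(np.allclose(np.abs(vt2o), np.abs(V)))          # same components, up to row signs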
answered Nov 16 '22 by paul larmuseau