Code:
import numpy
from matplotlib.mlab import PCA
file_name = "store1_pca_matrix.txt"
ori_data = numpy.loadtxt(file_name)  # all other loadtxt arguments were left at their defaults
result = PCA(ori_data)
This is my code. Although my input matrix contains no NaN or inf values, I still get the error below:
raise LinAlgError("SVD did not converge")
LinAlgError: SVD did not converge
What's the problem?
This can happen when there are inf or NaN values in the data.
If your data lived in a pandas DataFrame, you could remove rows with NaN values like this:
df.dropna(inplace=True)
Here, however, numpy.loadtxt returns a plain NumPy array, which has no dropna method, so the cleanup has to be done with NumPy itself (see the sketch below).
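A minimal NumPy sketch of that cleanup (reusing ori_data from the question; it assumes the file loads as a 2-D array):
import numpy
ori_data = numpy.loadtxt("store1_pca_matrix.txt")  # as in the question
# report whether any entries are NaN or infinite
print(numpy.isnan(ori_data).any(), numpy.isinf(ori_data).any())
# keep only the rows in which every entry is finite
ori_data = ori_data[numpy.isfinite(ori_data).all(axis=1)]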
I know this post is old, but in case someone else encounters the same problem: @jseabold was right that the cause is NaN or inf, and the OP was probably also right that the data contained neither. However, if one of the columns of ori_data always holds the same value, the data will acquire NaNs, because the PCA implementation in mlab standardizes the input data by doing
ori_data = (ori_data - mean(ori_data)) / std(ori_data)
and for a constant column the standard deviation is zero, so the division produces NaN (see the sketch below).
The solution is to do:
result = PCA(ori_data, standardize=False)
This way only the mean is subtracted, without dividing by the standard deviation.
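A small sketch illustrating the effect, using a made-up 3x2 matrix whose second column is constant:
import numpy as np

# the second column is constant, so its standard deviation is zero
data = np.array([[1.0, 5.0],
                 [2.0, 5.0],
                 [3.0, 5.0]])
std = data.std(axis=0)
print(std)                                        # [0.81649658 0.        ]
standardized = (data - data.mean(axis=0)) / std   # 0/0 -> NaN plus a RuntimeWarning
print(np.isnan(standardized).any())               # True
# locate the zero-variance columns before calling PCA
print(np.where(std == 0)[0])                      # [1]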
If there are no inf or NaN values, it could also be a memory issue. Please try on a machine with more RAM.
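If memory really is the bottleneck, one possible mitigation (a sketch; it assumes single precision is acceptable for your data) is to load the matrix as float32, which halves the memory footprint:
import numpy

# float32 uses 4 bytes per entry instead of 8, roughly halving memory use;
# note that reduced precision can itself influence SVD convergence
ori_data = numpy.loadtxt("store1_pca_matrix.txt", dtype=numpy.float32)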
I do not have an answer to this question, but I do have a reproduction scenario with no NaNs or infs. Unfortunately the dataset is pretty large (96 MB gzipped).
import numpy as np
from StringIO import StringIO  # Python 2; on Python 3 use io.BytesIO
from scipy import linalg
import urllib2  # Python 2; on Python 3 use urllib.request
import gzip
# download the gzipped matrix and decompress it in memory before parsing
url = 'http://physics.muni.cz/~vazny/gauss/X.gz'
X = np.loadtxt(gzip.GzipFile(fileobj=StringIO(urllib2.urlopen(url).read())), delimiter=',')
linalg.svd(X, full_matrices=False)
which raises:
LinAlgError: SVD did not converge
on:
>>> np.__version__
'1.8.1'
>>> import scipy
>>> scipy.__version__
'0.10.1'
but did not raise an exception on:
>>> np.__version__
'1.8.2'
>>> import scipy
>>> scipy.__version__
'0.14.0'
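Besides upgrading, a workaround I have seen for non-converging SVDs (a sketch; it assumes SciPy >= 0.18, where scipy.linalg.svd gained the lapack_driver argument, and reuses X from the snippet above) is to fall back from the default 'gesdd' LAPACK driver to 'gesvd':
from scipy import linalg

try:
    U, s, Vt = linalg.svd(X, full_matrices=False)
except linalg.LinAlgError:
    # 'gesdd' is faster but fails to converge more often;
    # 'gesvd' is slower but numerically more robust
    U, s, Vt = linalg.svd(X, full_matrices=False, lapack_driver='gesvd')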