I am using scikit-learn for a dimensionality reduction task. My training/test data is in the libsvm format, stored as a large sparse matrix with half a million columns.
I load the data with the load_svmlight_file function, but when I pass the result to SparsePCA, scikit-learn throws an exception about invalid input data.
How can I fix this?
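Roughly, the failing setup looks like this (a minimal sketch; the file name train.libsvm and the n_components value are placeholders, not from the original post):

```python
from sklearn.datasets import load_svmlight_file
from sklearn.decomposition import SparsePCA

# load_svmlight_file returns a scipy.sparse CSR matrix and the target vector
X, y = load_svmlight_file("train.libsvm")  # hypothetical path

# SparsePCA expects a dense array, so this raises an error on sparse input
pca = SparsePCA(n_components=100)
X_reduced = pca.fit_transform(X)
```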
Sparse PCA is an algorithm for finding a sparse decomposition (the components have a sparsity constraint) on dense data.
If you want to do vanilla PCA on sparse data you should use sklearn.decomposition.RandomizedPCA, which implements a scalable approximate method that works on both sparse and dense data.
IIRC sklearn.decomposition.PCA only works on dense data at the moment. Support for sparse data could be added in the future by delegating the SVD computation on the sparse data matrix to ARPACK, for instance.
Edit: as noted in the comments, sparse input for RandomizedPCA is deprecated: instead you should use sklearn.decomposition.TruncatedSVD, which does precisely what RandomizedPCA used to do on sparse data but should not have been called PCA in the first place.
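For example, a minimal sketch of the sparse-friendly replacement (the file name and n_components are placeholders):

```python
from sklearn.datasets import load_svmlight_file
from sklearn.decomposition import TruncatedSVD

X, y = load_svmlight_file("train.libsvm")  # hypothetical path

# TruncatedSVD accepts scipy.sparse input directly, no densification needed
svd = TruncatedSVD(n_components=100, random_state=0)
X_reduced = svd.fit_transform(X)  # shape: (n_samples, 100)
```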
To clarify: PCA is mathematically defined as centering the data (subtracting the mean value of each feature) and then applying truncated SVD to the centered data.
Since centering the data would destroy the sparsity and force a dense representation that often no longer fits in memory, it is common to apply truncated SVD directly to sparse data (without centering). This resembles PCA, but it is not exactly the same. This is implemented in scikit-learn as sklearn.decomposition.TruncatedSVD.
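To illustrate the relationship, here is a small sketch (using randomly generated data, not from the original question) showing that on already-centered data, TruncatedSVD recovers the same components as PCA, up to sign flips:

```python
import numpy as np
from sklearn.decomposition import PCA, TruncatedSVD

rng = np.random.RandomState(0)
X = rng.rand(100, 20)
X_centered = X - X.mean(axis=0)  # the centering step that PCA performs internally

pca = PCA(n_components=5).fit(X_centered)
# arpack gives an exact (non-randomized) truncated SVD for the comparison
svd = TruncatedSVD(n_components=5, algorithm="arpack").fit(X_centered)

# the components agree up to sign flips, so compare absolute values
print(np.allclose(np.abs(pca.components_), np.abs(svd.components_)))  # expected: True
```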
Edit (March 2019): There is ongoing work to implement PCA on sparse data with implicit centering: https://github.com/scikit-learn/scikit-learn/pull/12841