I suddenly can't load some newly upgraded modules, e.g. scikit-learn and zope, although I can still import other packages. The path the import resolves to points to the correct Anaconda folder, which contains all the code. Any ideas what might be wrong and how to fix it?
Python 2.7.13 |Anaconda custom (64-bit)| (default, Dec 20 2016, 23:09:15)
[GCC 4.4.7 20120313 (Red Hat 4.4.7-1)] on linux2
>>> import sklearn
>>> from os import listdir
>>> print(dir(sklearn))
['_ASSUME_FINITE', '__SKLEARN_SETUP__', '__all__', '__builtins__', '__check_build', '__doc__', '__file__', '__name__', '__package__', '__path__', '__version__', '_contextmanager', 'base', 'clone', 'config_context', 'exceptions', 'externals', 'get_config', 'logger', 'logging', 'os', 're', 'set_config', 'setup_module', 'sys', 'utils', 'warnings']
>>> print(listdir(sklearn.__path__[0]))
['exceptions.py', 'cross_validation.pyc', 'lda.py', 'naive_bayes.pyc', 'isotonic.py', '_build_utils', 'neighbors', 'cluster', 'naive_bayes.py', '__init__.pyc', 'multiclass.py', 'dummy.pyc', 'grid_search.pyc', 'tests', '__init__.py', 'calibration.py', '_isotonic.so', 'neural_network', 'datasets', 'preprocessing', '__check_build', 'random_projection.py', 'multiclass.pyc', 'model_selection', 'calibration.pyc', 'pipeline.pyc', 'qda.py', 'learning_curve.py', 'ensemble', 'tree', 'isotonic.pyc', 'kernel_ridge.py', 'gaussian_process', 'decomposition', 'base.pyc', 'dummy.py', 'utils', 'pipeline.py', 'cross_decomposition', 'covariance', 'qda.pyc', 'multioutput.pyc', 'lda.pyc', 'feature_selection', 'linear_model', 'metrics', 'kernel_ridge.pyc', 'setup.py', 'semi_supervised', 'exceptions.pyc', 'multioutput.py', 'cross_validation.py', 'discriminant_analysis.py', 'kernel_approximation.pyc', 'base.py', 'random_projection.pyc', 'setup.pyc', 'kernel_approximation.py', 'grid_search.py', 'discriminant_analysis.pyc', 'mixture', 'manifold', 'externals', 'svm', 'feature_extraction', 'learning_curve.pyc']
>>> import zope
>>> print(dir(zope))
['__doc__', '__name__', '__path__']
>>> print(listdir(zope.__path__[0]))
['interface']
>>> zope.interface
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'module' object has no attribute 'interface'
>>> sklearn.lda
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
AttributeError: 'module' object has no attribute 'lda'
Sadly, it's true... your error is real, but so is the deprecation warning shown in the code below: sklearn.lda was removed, so use sklearn.discriminant_analysis.LinearDiscriminantAnalysis from now on. How to deal with the other error, should it persist, is covered further down in my answer. Enjoy!
import warnings
from .discriminant_analysis import LinearDiscriminantAnalysis as _LDA

warnings.warn("lda.LDA has been moved to "
              "discriminant_analysis.LinearDiscriminantAnalysis "
              "in 0.17 and will be removed in 0.19", DeprecationWarning)


class LDA(_LDA):
    """
    Alias for :class:`sklearn.discriminant_analysis.LinearDiscriminantAnalysis`.

    .. deprecated:: 0.17
        This class will be removed in 0.19.
        Use :class:`sklearn.discriminant_analysis.LinearDiscriminantAnalysis` instead.
    """
    pass
No errors when running the code below:
import sklearn
from sklearn import discriminant_analysis
from os import listdir
print(dir(sklearn))
print(listdir(sklearn.__path__[0]))
print(discriminant_analysis.LinearDiscriminantAnalysis())
Snippet output:
['_ASSUME_FINITE', ...]
['base.py', ...]
LinearDiscriminantAnalysis(n_components=None, priors=None, shrinkage=None, solver='svd', store_covariance=False, tol=0.0001)
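Regarding the zope traceback in your session: before reinstalling anything, a quick check may be worth trying. zope is a namespace package, and import zope does not automatically load its subpackages, so they only appear as attributes once imported explicitly. This is just a sketch, assuming zope.interface is still installed in the same Anaconda site-packages folder:

# Quick check: subpackages must be imported explicitly before they
# show up as attributes of the parent package.
import zope.interface                # explicit import, instead of just "import zope"
from zope.interface import Interface

print(zope.interface.__file__)       # should point into .../site-packages/zope/interface/

# If this import itself fails, the package really is broken and the
# reinstall steps below are the way to go.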
It seems your installation went wrong somewhere along the way. Your best option for now is:
1 - delete your scikit-image, sklearn and zope folders from the ../site-packages folder;
2 - move the scikit_image-0.13.0-py2.7.egg-info, scikit_learn-0.19.1-py2.7.egg-info and zope installer folders, contents included, to the trash bin;
3 - empty the trash bin (this prevents anything from still linking to folders sitting in the trash bin);
4 - run pip install with the --no-cache-dir option for scikit-image, sklearn and zope;
5 - voila... you have a running skimage, sklearn and zope again (a quick import check is sketched below).
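After the reinstall, a minimal sanity check along these lines should run without errors (a sketch, assuming Python 2.7 and the package versions mentioned above):

# Minimal post-reinstall check; the version numbers in the comments are
# only the ones taken from the egg-info folder names above.
import sklearn
import skimage
import zope.interface
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

print(sklearn.__version__)            # e.g. 0.19.1
print(skimage.__version__)            # e.g. 0.13.0
print(zope.interface.__file__)        # should resolve inside site-packages
print(LinearDiscriminantAnalysis())   # the replacement for the removed sklearn.lda.LDA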
Enjoy!