(*) ISSUE: I am trying to install GraphLab Create on macOS for my Coursera class. The installer fails while creating a new conda environment.
(*) ATTEMPTED SOLUTIONS:
(1) Installed the packages listed in the error message for the current version of conda with: ./conda install conda=4.0
(2) Updated conda itself, but when I ran the GraphLab installer it downgraded conda to 4.0.8
(*) ERROR MESSAGE:
Did not find a broken "gl-env" environment.
Creating conda environment "gl-env".
Using Anaconda Cloud api site https://api.anaconda.org
Fetching package metadata:
Solving package specifications:
Error: Dependencies missing in current osx-64 channels:
- anaconda 4.0|4.0.0* -> scipy 0.17.0 np110py27_0 -> blas * mkl
- anaconda 4.0|4.0.0* -> pytables 3.2.2 np110py27_1 -> numexpr 2.5* -> blas * mkl
- anaconda 4.0|4.0.0* -> numpy 1.10.4 py27_0 -> blas * mkl
- anaconda 4.0|4.0.0* -> pytables 3.2.2 np110py35_1 -> numexpr 2.5* -> blas * openblas
- anaconda 4.0|4.0.0* -> pytables 3.2.2 np110py27_1 -> numexpr 2.5* -> numpy 1.11* -> blas * openblas
- anaconda 4.0|4.0.0* -> numexpr 2.5 np110py27_0 -> blas * mkl
- anaconda 4.0|4.0.0* -> pytables 3.2.2 np110py27_1 -> numexpr 2.5* -> blas * openblas
- anaconda 4.0|4.0.0* -> pytables 3.2.2 np110py34_1 -> numexpr 2.5* -> numpy 1.11* -> blas * openblas
- anaconda 4.0|4.0.0* -> pytables 3.2.2 np110py34_1 -> numexpr 2.5* -> numpy 1.11* -> blas * mkl
- anaconda 4.0|4.0.0* -> pytables 3.2.2 np110py34_1 -> numexpr 2.5* -> blas * openblas
- anaconda 4.0|4.0.0* -> pytables 3.2.2 np110py35_1 -> numexpr 2.5* -> numpy 1.11* -> blas * openblas
- anaconda 4.0|4.0.0* -> pytables 3.2.2 np110py27_1 -> numexpr 2.5* -> numpy 1.11* -> blas * mkl
- anaconda 4.0|4.0.0* -> pytables 3.2.2 np110py35_1 -> numexpr 2.5* -> numpy 1.11* -> blas * mkl
- anaconda 4.0|4.0.0* -> numpy 1.10.4 py34_0 -> blas * mkl
- anaconda 4.0|4.0.0* -> pytables 3.2.2 np110py34_1 -> numexpr 2.5* -> blas * mkl
- anaconda 4.0|4.0.0* -> numpy 1.10.4 py35_0 -> blas * mkl
- anaconda 4.0|4.0.0* -> scipy 0.17.0 np110py34_0 -> blas * mkl
- anaconda 4.0|4.0.0* -> numexpr 2.5 np110py35_0 -> blas * mkl
- anaconda 4.0|4.0.0* -> scipy 0.17.0 np110py35_0 -> blas * mkl
- anaconda 4.0|4.0.0* -> pytables 3.2.2 np110py35_1 -> numexpr 2.5* -> blas * mkl
- anaconda 4.0|4.0.0* -> numexpr 2.5 np110py34_0 -> blas * mkl
- anaconda 4.0|4.0.0* -> scikit-learn 0.17.1 np110py35_0 -> blas * mkl
- anaconda 4.0|4.0.0* -> scikit-learn 0.17.1 np110py27_0 -> blas * mkl
- anaconda 4.0|4.0.0* -> scikit-learn 0.17.1 np110py34_0 -> blas * mkl
You can search for this package on anaconda.org with
anaconda search -t conda blas * mkl
(and similarly for the other packages)
(*) CONDA ENVIRONMENT:
platform : osx-64
conda version : 4.0.8
conda-build version : 0+unknown
python version : 2.7.13.final.0
requests version : 2.9.1
channel URLs : https://repo.continuum.io/pkgs/free/osx-64/
https://repo.continuum.io/pkgs/free/noarch/
https://repo.continuum.io/pkgs/pro/osx-64/
https://repo.continuum.io/pkgs/pro/noarch/
config file : None
is foreign system : False
(*) INSTALLED PACKAGE VERSIONS:
pip 8.1.1
pip list:
alabaster (0.7.7)
anaconda-client (1.4.0)
anaconda-navigator (1.1.0)
appnope (0.1.0)
appscript (1.0.1)
argcomplete (1.0.0)
astropy (1.1.2)
Babel (2.2.0)
backports-abc (0.4)
backports.ssl-match-hostname (3.4.0.2)
beautifulsoup4 (4.4.1)
bitarray (0.8.1)
blaze (0.9.1)
bokeh (0.11.1)
boto (2.39.0)
Bottleneck (1.0.0)
cdecimal (2.3)
cffi (1.5.2)
chest (0.2.3)
cloudpickle (0.1.1)
clyent (1.2.1)
colorama (0.3.7)
conda (4.0.8)
conda-build (0+unknown)
conda-env (2.4.5)
conda-manager (0.3.1)
configobj (5.0.6)
cryptography (1.3)
cycler (0.10.0)
Cython (0.23.4)
cytoolz (0.7.5)
dask (0.8.1)
datashape (0.5.1)
decorator (4.0.9)
dill (0.2.4)
docutils (0.12)
dynd (0.7.3.dev1)
enum34 (1.1.2)
et-xmlfile (1.0.1)
fastcache (1.0.2)
Flask (0.10.1)
Flask-Cors (2.1.2)
funcsigs (0.4)
futures (3.0.3)
gevent (1.1.0)
greenlet (0.4.9)
grin (1.2.1)
h5py (2.5.0)
HeapDict (1.0.0)
idna (2.0)
ipaddress (1.0.14)
ipykernel (4.3.1)
ipython (4.1.2)
ipython-genutils (0.1.0)
ipywidgets (4.1.1)
itsdangerous (0.24)
jdcal (1.2)
jedi (0.9.0)
Jinja2 (2.8)
jsonschema (2.4.0)
jupyter (1.0.0)
jupyter-client (4.2.2)
jupyter-console (4.1.1)
jupyter-core (4.1.0)
llvmlite (0.9.0)
locket (0.2.0)
lxml (3.6.0)
MarkupSafe (0.23)
matplotlib (1.5.1)
mistune (0.7.2)
mpmath (0.19)
multipledispatch (0.4.8)
nbconvert (4.1.0)
nbformat (4.0.1)
networkx (1.11)
nltk (3.2)
nose (1.3.7)
notebook (4.1.0)
numba (0.24.0)
numexpr (2.5)
numpy (1.10.4)
odo (0.4.2)
openpyxl (2.3.2)
pandas (0.18.0)
partd (0.3.2)
path.py (0.0.0)
patsy (0.4.0)
pep8 (1.7.0)
pexpect (4.0.1)
pickleshare (0.5)
Pillow (3.1.1)
pip (8.1.1)
ply (3.8)
psutil (4.1.0)
ptyprocess (0.5)
py (1.4.31)
pyasn1 (0.1.9)
PyAudio (0.2.7)
pycosat (0.6.1)
pycparser (2.14)
pycrypto (2.6.1)
pycurl (7.19.5.3)
pyflakes (1.1.0)
Pygments (2.1.1)
pyOpenSSL (0.15.1)
pyparsing (2.0.3)
pytest (2.8.5)
python-dateutil (2.5.1)
pytz (2016.2)
PyYAML (3.11)
pyzmq (15.2.0)
QtAwesome (0.3.2)
qtconsole (4.2.0)
QtPy (1.0)
redis (2.10.3)
requests (2.9.1)
rope (0.9.4)
scikit-image (0.12.3)
scikit-learn (0.17.1)
scipy (0.17.0)
setuptools (20.3)
simplegeneric (0.8.1)
singledispatch (3.4.0.3)
six (1.10.0)
snowballstemmer (1.2.1)
sockjs-tornado (1.0.1)
Sphinx (1.3.5)
sphinx-rtd-theme (0.1.9)
spyder (2.3.8)
SQLAlchemy (1.0.12)
statsmodels (0.6.1)
sympy (1.0)
tables (3.2.2)
terminado (0.5)
toolz (0.7.4)
tornado (4.3)
traitlets (4.2.1)
unicodecsv (0.14.1)
Werkzeug (0.11.4)
wheel (0.29.0)
xlrd (0.9.4)
XlsxWriter (0.8.4)
xlwings (0.7.0)
xlwt (1.0.0)
I also tried creating the conda environment manually but received the same error. Any ideas on the best way to proceed from here?
Problems like this are usually caused by version incompatibilities, so the solution is to update conda before installing GraphLab:
conda update conda
I had the same problem. Try installing some of those packages inside the conda environment first.
Running:
conda install scikit-learn
conda install scipy
accepting the proposed package changes, and then re-running the original conda create -n gl-env
command worked for me.
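A minimal sketch of that sequence (the python=2.7 spec below is an assumption, since GraphLab Create targets Python 2.7; reuse whatever package specs your original create command had):

```shell
# Install the BLAS-dependent packages first so conda resolves a
# compatible "blas" build before the environment is created.
conda install scikit-learn
conda install scipy

# Then re-run the environment creation the GraphLab installer expects.
conda create -n gl-env python=2.7
```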