
How to import pyspark in anaconda

I am trying to import and use pyspark with anaconda.

After installing Spark and setting the $SPARK_HOME variable, I tried:

$ pip install pyspark

This didn't work (of course), because I discovered that I need to tell Python to look for pyspark under $SPARK_HOME/python/. The problem is that to do that, I need to set $PYTHONPATH, while Anaconda doesn't use that environment variable.
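For reference, this is roughly what I mean by pointing Python at $SPARK_HOME/python by hand (a sketch only; the py4j zip name depends on the Spark release):

import glob
import os
import sys

# Mimic what setting PYTHONPATH would do: put Spark's Python sources on sys.path.
spark_home = os.environ["SPARK_HOME"]
sys.path.insert(0, os.path.join(spark_home, "python"))
# py4j ships inside Spark; the exact zip name varies by Spark version.
sys.path.insert(0, glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*-src.zip"))[0])

import pyspark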

I also tried copying the contents of $SPARK_HOME/python/ to ANACONDA_HOME/lib/python2.7/site-packages/, but that didn't work either.

Is there a way to use pyspark with Anaconda?

asked Nov 19 '15 by farhawa


People also ask

Can we use PySpark in a Jupyter notebook?

There are two ways to get PySpark working in a Jupyter Notebook: configure the PySpark driver to use Jupyter, so that running pyspark automatically opens a notebook, or open a regular Jupyter Notebook and locate PySpark with the findspark package.
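For the findspark route, a minimal notebook cell looks roughly like this (a sketch, assuming findspark is installed and SPARK_HOME points at a Spark install):

import findspark
findspark.init()  # locates Spark via SPARK_HOME (or a path passed explicitly)

import pyspark
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("notebook").getOrCreate()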


1 Answer

This may have only become possible recently, but I used the following and it worked perfectly. After this, I am able to 'import pyspark as ps' and use it with no problems.

conda install -c conda-forge pyspark
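
A quick way to verify the install afterwards (a minimal sketch, assuming the PySpark 2.x+ build that conda-forge ships):

import pyspark as ps
from pyspark.sql import SparkSession

# Start a local session and print the version to confirm the import works.
spark = SparkSession.builder.master("local[*]").appName("smoke-test").getOrCreate()
print(ps.__version__)
spark.stop()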

answered Sep 22 '22 by mewa6