I am trying to import and use pyspark with Anaconda.
After installing Spark and setting the $SPARK_HOME variable, I tried:
$ pip install pyspark
This doesn't work (of course), because I discovered that I need to tell Python to look for pyspark under $SPARK_HOME/python/. The problem is that to do that I need to set $PYTHONPATH, and Anaconda doesn't use that environment variable.
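For illustration, the equivalent of setting $PYTHONPATH from inside a script would be something like the following sketch (it assumes a standard Spark layout; the exact py4j zip name under $SPARK_HOME/python/lib/ differs between Spark versions):

import glob
import os
import sys

spark_home = os.environ["SPARK_HOME"]
# Put Spark's bundled Python sources and the py4j zip on the import path
sys.path.insert(0, os.path.join(spark_home, "python"))
sys.path.extend(glob.glob(os.path.join(spark_home, "python", "lib", "py4j-*.zip")))

import pyspark  # should now resolve against $SPARK_HOME/python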
I tried copying the contents of $SPARK_HOME/python/ to ANACONDA_HOME/lib/python2.7/site-packages/, but that didn't work either.
Is there a way to use pyspark with Anaconda?
There are two ways to get PySpark available in a Jupyter Notebook:

1. Configure the PySpark driver to use Jupyter Notebook: running pyspark will then automatically open a Jupyter Notebook.
2. Load a regular Jupyter Notebook and load PySpark using the findspark package.
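The first option typically amounts to exporting PYSPARK_DRIVER_PYTHON=jupyter and PYSPARK_DRIVER_PYTHON_OPTS=notebook before running pyspark. For the second option, the top of the notebook could look something like this sketch (it assumes findspark has been installed, e.g. with pip install findspark, and that $SPARK_HOME is set; the app name is just a placeholder):

import findspark
findspark.init()  # locates Spark via $SPARK_HOME and adjusts sys.path

import pyspark
sc = pyspark.SparkContext(appName="notebook-test")  # hypothetical app name
print(sc.parallelize(range(10)).sum())  # quick smoke test, expect 45
sc.stop()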
This may have only become possible recently, but I used the following and it worked perfectly. After this, I am able to 'import pyspark as ps' and use it with no problems.
conda install -c conda-forge pyspark
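To sanity-check the install afterwards, something like the following minimal sketch should run (it assumes the conda-forge package pulled in Spark 2.x or later, where the SparkSession API is available):

from pyspark.sql import SparkSession

# Start a local Spark session and run a trivial job
spark = SparkSession.builder.master("local[*]").appName("conda-test").getOrCreate()
print(spark.range(100).count())  # expect 100
spark.stop()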