I am trying to install PySpark like this:
python setup.py install
I get this error:
Could not import pypandoc - required to package PySpark
pypandoc is already installed.
Any ideas on how I can install pyspark?
To test whether your installation was successful, open a Command Prompt, change to the SPARK_HOME directory, and type bin\pyspark. This should start the PySpark shell, which you can use to work interactively with Spark. The last startup message hints at how to work with Spark in the PySpark shell via the sc or sqlContext names.
I faced the same issue and solved it as below: install pypandoc before installing pyspark.
pip install pypandoc
pip install pyspark
Try installing pypandoc for Python 3 with pip3 install pypandoc.
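If the error still appears after installing pypandoc, a common cause is that pypandoc was installed for a different Python interpreter than the one running setup.py. A minimal diagnostic sketch (the module name pypandoc comes from the error above; everything else is standard library):

```python
import importlib.util
import sys

# Show which interpreter is running; pip must target this same Python.
print(sys.executable)

# find_spec returns None when the module is not importable here,
# which is exactly the condition that triggers the setup.py error.
spec = importlib.util.find_spec("pypandoc")
print("pypandoc importable" if spec is not None else "pypandoc missing")
```

If this prints "pypandoc missing", install pypandoc with the pip belonging to the interpreter shown, e.g. python -m pip install pypandoc, then rerun setup.py with that same interpreter.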