I am trying to fire up a Jupyter notebook when I run the command pyspark in the console. Right now, typing it only starts an interactive shell in the console, which is not convenient for typing long blocks of code. Is there a way to connect a Jupyter notebook to the pyspark shell? Thanks.
I'm assuming you already have Spark and Jupyter installed and that they work flawlessly independently of each other.
If that is the case, follow the steps below and you should be able to fire up a Jupyter notebook with a (py)spark backend.
Go to your Spark installation folder; it should contain a bin directory:
/path/to/spark/bin
Create a file there; let's call it start_pyspark.sh.
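For example, from a bash prompt (the paths are placeholders; adjust them to your setup):
cd /path/to/spark/bin
touch start_pyspark.sh
chmod +x start_pyspark.sh  # make the script executable so you can run it directly later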
Open start_pyspark.sh and write something like:
#!/bin/bash
# Use your Anaconda Python for the Spark workers
export PYSPARK_PYTHON=/path/to/anaconda3/bin/python
# Launch the driver through Jupyter instead of the plain REPL
export PYSPARK_DRIVER_PYTHON=/path/to/anaconda3/bin/jupyter
# Start a notebook server on port 8880, reachable from any interface, without opening a browser
export PYSPARK_DRIVER_PYTHON_OPTS="notebook --NotebookApp.open_browser=False --NotebookApp.ip='*' --NotebookApp.port=8880"
# Forward any extra arguments (e.g. --packages) to pyspark
pyspark "$@"
Replace the /path/to ... placeholders with the paths where your python and jupyter binaries are actually installed.
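If you are not sure where those binaries live, which can tell you (assuming anaconda3/bin is already on your PATH):
which python
which jupyter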
Most probably this step is already done, but just in case:
Modify your ~/.bashrc file by adding the following lines:
# Spark
export PATH="/path/to/spark/bin:/path/to/spark/sbin:$PATH"
export SPARK_HOME="/path/to/spark"
export SPARK_CONF_DIR="/path/to/spark/conf"
Run source ~/.bashrc and you are set.
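To confirm the environment is picked up, you can check from the same shell (expected outputs assume the example paths above):
echo $SPARK_HOME  # should print /path/to/spark
which pyspark     # should resolve to /path/to/spark/bin/pyspark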
Go ahead and try start_pyspark.sh.
You can also pass arguments to the script, something like
start_pyspark.sh --packages dibbhatt:kafka-spark-consumer:1.0.14
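A full session might look like this (the port 8880 comes from the script above; the host name is whatever machine you run it on):
./start_pyspark.sh --packages dibbhatt:kafka-spark-consumer:1.0.14
# then point a browser at http://<your-host>:8880 and open a new notebook;
# sc (SparkContext) and spark (SparkSession) should already be defined there,
# as in the regular pyspark shell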
Hope it works out for you.
