I run a PySpark cluster on AWS EMR and use Jupyter as the front end to the PySpark driver. Recently the script started failing: Jupyter fails to start a server. I install Jupyter with conda install jupyter and start it with sudo initctl start jupyter. The cluster itself runs fine. Here are the config values of the server.
# jupyter configs
mkdir -p ~/.jupyter
touch ~/.jupyter/jupyter_notebook_config.py
HASHED_PASSWORD=$(python -c "from notebook.auth import passwd; print(passwd('$JUPYTER_PASSWORD'))")
echo "c.NotebookApp.password = u'$HASHED_PASSWORD'" >> ~/.jupyter/jupyter_notebook_config.py
echo "c.NotebookApp.open_browser = False" >> ~/.jupyter/jupyter_notebook_config.py
echo "c.NotebookApp.ip = '*'" >> ~/.jupyter/jupyter_notebook_config.py
echo "c.NotebookApp.notebook_dir = '/mnt/$BUCKET/$FOLDER'" >> ~/.jupyter/jupyter_notebook_config.py
echo "c.ContentsManager.checkpoints_kwargs = {'root_dir': '.checkpoints'}" >> ~/.jupyter/jupyter_notebook_config.py
echo "c.NotebookApp.port = 8080" >> ~/.jupyter/jupyter_notebook_config.py
I found that after the update to Jupyter Notebook 5.7, I had to modify the config parameters. Change
echo "c.NotebookApp.ip = '*'" >> ~/.jupyter/jupyter_notebook_config.py
to
echo "c.NotebookApp.ip = '0.0.0.0'" >> ~/.jupyter/jupyter_notebook_config.py