 

Create pyspark kernel for Jupyter

I was looking at Apache Toree to use as a PySpark kernel for Jupyter:

https://github.com/apache/incubator-toree

However, it was using an older version of Spark (1.5.1 vs. the current 1.6.0). I tried the method described at http://arnesund.com/2015/09/21/spark-cluster-on-openstack-with-multi-user-jupyter-notebook/ by creating a kernel.json:

{
 "display_name": "PySpark",
 "language": "python",
 "argv": [
  "/usr/bin/python",
  "-m",
  "ipykernel",
  "-f",
  "{connection_file}"
 ],
 "env": {
  "SPARK_HOME": "/usr/local/Cellar/apache-spark/1.6.0/libexec",
  "PYTHONPATH": "/usr/local/Cellar/apache-spark/1.6.0/libexec/python/:/usr/local/Cellar/apache-spark/1.6.0/libexec/python/lib/py4j-0.9-src.zip",
  "PYTHONSTARTUP": "/usr/local/Cellar/apache-spark/1.6.0/libexec/python/pyspark/shell.py",
  "PYSPARK_SUBMIT_ARGS": "--master local[*] pyspark-shell"
 }
}
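For reference, a spec like the one above can also be written out programmatically. The helper below is a hypothetical sketch (the `write_kernelspec` name is mine, and the Spark paths are the question's Homebrew-specific ones, which will differ per install):

```python
import json
import os

# Kernel spec from the question; the paths are specific to a Homebrew
# install of Spark 1.6.0 and will differ on other machines.
spec = {
    "display_name": "PySpark",
    "language": "python",
    "argv": ["/usr/bin/python", "-m", "ipykernel", "-f", "{connection_file}"],
    "env": {
        "SPARK_HOME": "/usr/local/Cellar/apache-spark/1.6.0/libexec",
        "PYTHONPATH": "/usr/local/Cellar/apache-spark/1.6.0/libexec/python/:"
                      "/usr/local/Cellar/apache-spark/1.6.0/libexec/python/lib/py4j-0.9-src.zip",
        "PYTHONSTARTUP": "/usr/local/Cellar/apache-spark/1.6.0/libexec/python/pyspark/shell.py",
        "PYSPARK_SUBMIT_ARGS": "--master local[*] pyspark-shell",
    },
}

def write_kernelspec(kernels_dir, name, spec):
    """Write spec as kernel.json under <kernels_dir>/<name>/ and return its path."""
    target = os.path.join(kernels_dir, name)
    os.makedirs(target, exist_ok=True)
    path = os.path.join(target, "kernel.json")
    with open(path, "w") as f:
        json.dump(spec, f, indent=1)
    return path
```

The file must be named kernel.json (not kernel.js) and sit in its own directory under a kernels/ folder that Jupyter actually searches.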

However, I ran into a few problems:

  1. There is no /jupyter/kernels path on my Mac, so I ended up creating ~/.jupyter/kernels/pyspark. I am not sure that is the correct path.

  2. Even with all the paths in place, I still don't see PySpark show up as a kernel inside Jupyter.

What did I miss?

Asked Jan 25 '16 by HP.

2 Answers

Launch Jupyter Notebook with the Python kernel, then run the following commands to initialize PySpark within Jupyter:

# findspark (pip install findspark) locates Spark via SPARK_HOME
import findspark
findspark.init()

import pyspark
sc = pyspark.SparkContext()

FYI: I have tried most of the configs to launch Apache Toree with a PySpark kernel in Jupyter, without success.
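For what it's worth, findspark.init() essentially locates SPARK_HOME and prepends Spark's Python directories to sys.path so `import pyspark` works. A rough stdlib-only sketch of that mechanism (the `init_spark_path` name is mine, and the py4j zip filename varies by Spark version):

```python
import glob
import os
import sys

def init_spark_path(spark_home=None):
    """Sketch of what findspark.init() does: put Spark's python/ dir and
    the bundled py4j zip on sys.path so `import pyspark` succeeds."""
    spark_home = spark_home or os.environ["SPARK_HOME"]
    py_dir = os.path.join(spark_home, "python")
    # py4j ships inside Spark; the version in the filename varies (e.g. py4j-0.9).
    py4j_zips = glob.glob(os.path.join(py_dir, "lib", "py4j-*-src.zip"))
    paths = [py_dir] + py4j_zips
    for p in paths:
        if p not in sys.path:
            sys.path.insert(0, p)
    return paths
```

This is why the approach works from a plain Python kernel: no custom kernelspec is needed once sys.path is set up.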

Answered Nov 10 '22 by Neil Kumar


Jupyter kernel specs should go in the kernels/ subdirectory of $JUPYTER_DATA_DIR. On OS X, this is ~/Library/Jupyter. See: http://jupyter.readthedocs.org/en/latest/system.html
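You can check where Jupyter actually looks with `jupyter --data-dir`. As a cross-check, here is a sketch of the per-platform defaults (the `jupyter_data_dir` helper below is my own approximation, not Jupyter's API):

```python
import os
import sys

def jupyter_data_dir():
    """Approximate Jupyter's per-user data dir; kernel specs live in its
    kernels/ subdirectory. An explicit JUPYTER_DATA_DIR override wins."""
    env = os.environ.get("JUPYTER_DATA_DIR")
    if env:
        return env
    home = os.path.expanduser("~")
    if sys.platform == "darwin":          # OS X
        return os.path.join(home, "Library", "Jupyter")
    if os.name == "nt":                   # Windows
        return os.path.join(os.environ.get("APPDATA", home), "jupyter")
    # Linux and others follow XDG conventions
    xdg = os.environ.get("XDG_DATA_HOME") or os.path.join(home, ".local", "share")
    return os.path.join(xdg, "jupyter")
```

So on the asker's Mac, the kernelspec belongs in ~/Library/Jupyter/kernels/pyspark/kernel.json, not under ~/.jupyter.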

Answered Nov 10 '22 by Spencer Boucher