How can I include additional jars when starting a Google DataProc cluster to use with Jupyter notebooks?

I am following the instructions for starting a Google DataProc cluster with an initialization script that starts a Jupyter notebook.

https://cloud.google.com/blog/big-data/2017/02/google-cloud-platform-for-data-scientists-using-jupyter-notebooks-with-apache-spark-on-google-cloud

How can I include extra JAR files (spark-xml, for example) in the resulting SparkContext in Jupyter notebooks (particularly pyspark)?

asked Sep 07 '17 by seandavi


1 Answer

The answer depends slightly on which jars you're looking to load. For example, you can use spark-xml with the following when creating a cluster:

$ gcloud dataproc clusters create [cluster-name] \
    --zone [zone] \
    --initialization-actions \
        gs://dataproc-initialization-actions/jupyter/jupyter.sh \
    --properties spark:spark.jars.packages=com.databricks:spark-xml_2.11:0.4.1
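Once a cluster created this way is running, the package is available to the SparkSession in the notebook. A minimal pyspark sketch of reading an XML file with spark-xml (the bucket path and rowTag value are placeholders, not from the original post):

```python
# Sketch: using spark-xml from a Jupyter notebook on the cluster.
# Assumes the notebook's existing SparkSession is bound to `spark`;
# gs://your-bucket/books.xml is a placeholder path.
df = (spark.read
      .format("xml")                 # short name registered by spark-xml
      .option("rowTag", "book")      # XML element to treat as one row
      .load("gs://your-bucket/books.xml"))
df.printSchema()
```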

To specify multiple Maven coordinates, you will need to change gcloud's list/dictionary delimiter from ',' to another character, since ',' is already used to separate the packages themselves:

$ gcloud dataproc clusters create [cluster-name] \
    --zone [zone] \
    --initialization-actions \
        gs://dataproc-initialization-actions/jupyter/jupyter.sh \
    --properties=^#^spark:spark.jars.packages=artifact1,artifact2,artifact3
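The leading ^#^ marker tells gcloud to split top-level key=value pairs on '#' instead of ',', leaving ',' free for use inside a value. A small illustrative Python helper (not part of gcloud; the function name is made up here) that assembles such a --properties value makes the mechanic concrete:

```python
def build_properties_flag(props, delim="#"):
    """Build a gcloud --properties value using an alternate delimiter.

    The leading ^<delim>^ marker tells gcloud to split key=value pairs
    on <delim> instead of ',', so ',' can appear inside values such as
    comma-separated Maven coordinates.
    """
    pairs = delim.join(f"{key}={value}" for key, value in props.items())
    return f"^{delim}^{pairs}"

flag = build_properties_flag({
    "spark:spark.jars.packages": "artifact1,artifact2,artifact3",
})
print(flag)  # ^#^spark:spark.jars.packages=artifact1,artifact2,artifact3
```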

Details on how the delimiter can be changed are in the gcloud help:

$ gcloud help topic escaping
answered Oct 02 '22 by Angus Davis