How to add third-party Java JAR files for use in PySpark

I have some third-party database client libraries in Java. I want to access them through

java_gateway.py 

For example, to make the client class (not a JDBC driver!) available to the Python client via the Java gateway:

java_import(gateway.jvm, "org.mydatabase.MyDBClient") 

It is not clear where to add the third-party libraries to the JVM classpath. I tried adding them to compute-classpath.sh, but that did not seem to work. I get:

Py4JError: Trying to call a package

Also, comparing with Hive: the Hive JAR files are not loaded via compute-classpath.sh, which makes me suspicious. There seems to be some other mechanism setting up the JVM-side classpath.
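
For context, the access pattern I am aiming for looks roughly like this (a sketch: sc is an existing SparkContext, sc._jvm is PySpark's internal handle to the gateway JVM, and org.mydatabase.MyDBClient is the placeholder class name from above):

from py4j.java_gateway import java_import

# Make the class visible on the gateway's JVM view
java_import(sc._jvm, "org.mydatabase.MyDBClient")

# Instantiating it through the gateway is where the
# "Py4JError: Trying to call a package" error appears
client = sc._jvm.org.mydatabase.MyDBClient()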


2 Answers

You can add external JARs as arguments when launching pyspark:

pyspark --jars file1.jar,file2.jar 
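
The same flag works with spark-submit when running a script instead of the interactive shell (my_script.py is a placeholder):

spark-submit --jars file1.jar,file2.jar my_script.py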

You can also add the path to a JAR file using the Spark configuration at runtime.

Here is an example:

from pyspark import SparkConf, SparkContext

conf = SparkConf().set("spark.jars", "/path-to-jar/spark-streaming-kafka-0-8-assembly_2.11-2.2.1.jar")
sc = SparkContext(conf=conf)
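
On Spark 2.x and later the same setting can go through SparkSession.builder instead; a sketch reusing the same JAR path:

from pyspark.sql import SparkSession

# spark.jars must be set before the session (and its JVM) is created
spark = (SparkSession.builder
         .config("spark.jars", "/path-to-jar/spark-streaming-kafka-0-8-assembly_2.11-2.2.1.jar")
         .getOrCreate())
sc = spark.sparkContext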

Refer to the Spark configuration documentation for more information.
