
Py4JJavaError java.lang.NullPointerException org.apache.spark.sql.DataFrameWriter.jdbc

I got this error when I tried to write a Spark DataFrame to a Postgres DB. I am using a local cluster, and the code is as follows:

from pyspark import SparkContext, SparkConf
from pyspark.sql import SQLContext
import os

# Put the PostgreSQL JDBC driver on the classpath
os.environ["SPARK_CLASSPATH"] = '/usr/share/java/postgresql-jdbc4.jar'

conf = SparkConf() \
.setMaster('local[2]') \
.setAppName("test")

sc = SparkContext(conf=conf)
sqlContext = SQLContext(sc)

df = sc.parallelize([("a", "b", "c", "d")]).toDF()

url_connect = "jdbc:postgresql://localhost:5432"
table = "table_test"
mode = "overwrite"
properties = {"user":"postgres", "password":"12345678"}
df.write.option('driver', 'org.postgresql.Driver').jdbc(
     url_connect, table, mode, properties)

The error log is as follows:

Py4JJavaError: An error occurred while calling o119.jdbc.
: java.lang.NullPointerException
at  org.apache.spark.sql.DataFrameWriter.jdbc(DataFrameWriter.scala:308)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:231)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:381)
at py4j.Gateway.invoke(Gateway.java:259)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:133)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:209)
at java.lang.Thread.run(Thread.java:745)

I have searched the web for an answer but could not find one. Thank you in advance!

asked Nov 09 '22 by Yiliang
1 Answer

Have you tried specifying the database in your table_test variable? I have a similar implementation that looks like this:

mysqlUrl = "jdbc:mysql://mysql:3306"
properties = {'user': 'root',
              'password': 'password',
              'driver': 'com.mysql.cj.jdbc.Driver'
              }
# Qualify the table with the database name
table = 'db_name.table_name'

try:
    schemaDF = spark.read.jdbc(mysqlUrl, table, properties=properties)
    print('schema DF loaded')
except Exception as e:
    print('schema DF does not exist!')
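Applying the same idea to the original PostgreSQL snippet, the likely fix is to name the target database in the JDBC URL and, optionally, schema-qualify the table. A minimal sketch (the database name `postgres` and schema `public` here are assumptions; substitute your own):

```python
# Name the target database at the end of the JDBC URL.
# "postgres" and "public" are placeholder names -- use your own.
url_connect = "jdbc:postgresql://localhost:5432/postgres"
table = "public.table_test"
mode = "overwrite"
properties = {"user": "postgres",
              "password": "12345678",
              "driver": "org.postgresql.Driver"}

# With a running cluster and database, the write call itself is unchanged:
# df.write.jdbc(url_connect, table, mode, properties)
```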
answered Nov 15 '22 by collin.clark