
py4j.protocol.Py4JJavaError occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe

I installed Apache Spark and PySpark on my machine (Ubuntu), and in PyCharm I also updated the environment variables (e.g. SPARK_HOME, PYSPARK_PYTHON). I'm trying to do:

import os, sys

# Point PySpark at the local Spark installation (paths truncated here).
os.environ['SPARK_HOME'] = ".../spark-2.3.0-bin-hadoop2.7"
sys.path.append(".../spark-2.3.0-bin-hadoop2.7/bin/pyspark/")
sys.path.append(".../spark-2.3.0-bin-hadoop2.7/python/lib/py4j-0.10.6-src.zip")

from pyspark import SparkContext
from pyspark import SparkConf

sc = SparkContext('local[2]')  # local mode, 2 worker threads
words = sc.parallelize(["scala", "java", "hadoop", "spark", "akka"])
print(words.count())

But I receive the following error:

py4j.protocol.Py4JJavaError: An error occurred while calling z:org.apache.spark.api.python.PythonRDD.collectAndServe.
: java.lang.IllegalArgumentException
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.xbean.asm5.ClassReader.<init>(Unknown Source)
at org.apache.spark.util.ClosureCleaner$.getClassReader(ClosureCleaner.scala:46)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:449)
at org.apache.spark.util.FieldAccessFinder$$anon$3$$anonfun$visitMethodInsn$2.apply(ClosureCleaner.scala:432)
at scala.collection.TraversableLike$WithFilter$$anonfun$foreach$1.apply(TraversableLike.scala:733)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
at scala.collection.mutable.HashMap$$anon$1$$anonfun$foreach$2.apply(HashMap.scala:103)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
at scala.collection.mutable.HashMap$$anon$1.foreach(HashMap.scala:103)
at scala.collection.TraversableLike$WithFilter.foreach(TraversableLike.scala:732)
at org.apache.spark.util.FieldAccessFinder$$anon$3.visitMethodInsn(ClosureCleaner.scala:432)
at org.apache.xbean.asm5.ClassReader.a(Unknown Source)
at org.apache.xbean.asm5.ClassReader.b(Unknown Source)
at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
at org.apache.xbean.asm5.ClassReader.accept(Unknown Source)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:262)
at org.apache.spark.util.ClosureCleaner$$anonfun$org$apache$spark$util$ClosureCleaner$$clean$14.apply(ClosureCleaner.scala:261)
at scala.collection.immutable.List.foreach(List.scala:381)
at org.apache.spark.util.ClosureCleaner$.org$apache$spark$util$ClosureCleaner$$clean(ClosureCleaner.scala:261)
at org.apache.spark.util.ClosureCleaner$.clean(ClosureCleaner.scala:159)
at org.apache.spark.SparkContext.clean(SparkContext.scala:2292)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2066)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:2092)
at org.apache.spark.rdd.RDD$$anonfun$collect$1.apply(RDD.scala:939)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
at org.apache.spark.rdd.RDD.collect(RDD.scala:938)
at org.apache.spark.api.python.PythonRDD$.collectAndServe(PythonRDD.scala:153)
at org.apache.spark.api.python.PythonRDD.collectAndServe(PythonRDD.scala)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:564)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:282)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:214)
at java.base/java.lang.Thread.run(Thread.java:844)

How can I solve this problem?

asked Apr 27 '18 by Saeid SOHEILY KHAH


2 Answers

I need to maintain both OpenJDK 11 and JDK 8 for different purposes, so downgrading is not an option. For Spark programs, I work around this by exporting (overriding) the JAVA_HOME path so it points to JDK 8, as below:

export JAVA_HOME=/Library/Java/JavaVirtualMachines/jdk1.8.0_65.jdk/Contents/Home/
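
If you'd rather not change JAVA_HOME shell-wide, the same override can be done from inside the Python script before the SparkContext is created, since Spark's launcher scripts honor JAVA_HOME. This is a minimal sketch; the JDK path is an example for Ubuntu's OpenJDK 8 package and must be adjusted to your own JDK 8 location:

import os

# Must run before SparkContext is created: the py4j gateway launches
# spark-submit as a subprocess, which inherits this environment.
# Example path for Ubuntu's OpenJDK 8 package; adjust to your install.
os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"

from pyspark import SparkContext

sc = SparkContext("local[2]")
print(sc.parallelize(range(10)).count())  # should now run without the ASM error
sc.stop()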
answered Oct 04 '22 by navjot mannan


Actually, I found a tricky solution. To solve this problem:

First, be sure that Py4J is installed correctly. It's better to install it using an official release. To do so:

  1. download the latest official release from https://pypi.org/project/py4j/.

  2. untar/unzip the archive and navigate to the newly created directory, e.g., cd py4j-0.x.

  3. run (use python or python3, matching the interpreter you run Spark with):

    sudo python3 setup.py install
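
To check that the package landed in the interpreter you actually run Spark with, here is a quick sanity check (a sketch; it assumes the standard Py4J layout, which ships a py4j.version module):

import py4j
from py4j.version import __version__ as py4j_version  # assumes standard Py4J layout

# Show which copy of Py4J is importable and its version; the version should
# match the py4j-*-src.zip bundled with your Spark (0.10.6 for Spark 2.3.0).
print(py4j.__file__)
print(py4j_version)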

Then downgrade Java to version 8 (previously, I used version 10). This is the actual root cause: Spark 2.3 bundles ASM 5, whose ClassReader cannot parse class files compiled for Java 9 or later, which is the IllegalArgumentException shown in the stack trace above. To do so, first remove the current version of Java using:

sudo apt-get purge openjdk-\* icedtea-\* icedtea6-\*

and then install Java 8 using:

sudo apt install openjdk-8-jre-headless 
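
Before re-running the job, you can confirm the downgrade took effect. For example, from Python (a sketch; note that java -version writes its banner to stderr, and capture_output needs Python 3.7+):

import subprocess

# 'java -version' prints its banner to stderr, not stdout
result = subprocess.run(["java", "-version"], capture_output=True, text=True)
print(result.stderr.strip())  # expect a '1.8.0_...' version line for Java 8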

Now the code works properly for me.

answered Oct 04 '22 by Saeid SOHEILY KHAH