
Issue upon Spark Upgrade: key not found: _PYSPARK_DRIVER_CONN_INFO_PATH

I downloaded the latest Spark version because it includes a fix for this error:

ERROR AsyncEventQueue:70 - Dropping event from queue appStatus.

After setting the environment variables and running the same code in PyCharm, I'm getting the following error, for which I can't find a solution:

Exception in thread "main" java.util.NoSuchElementException: key not found: _PYSPARK_DRIVER_CONN_INFO_PATH
    at scala.collection.MapLike$class.default(MapLike.scala:228)
    at scala.collection.AbstractMap.default(Map.scala:59)
    at scala.collection.MapLike$class.apply(MapLike.scala:141)
    at scala.collection.AbstractMap.apply(Map.scala:59)
    at org.apache.spark.api.python.PythonGatewayServer$.main(PythonGatewayServer.scala:64)
    at org.apache.spark.api.python.PythonGatewayServer.main(PythonGatewayServer.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:894)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:198)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:228)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
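
For what it's worth, even a minimal session like the one below hits the same failure, since the error is thrown while PySpark launches the JVM gateway (this is a sketch, not my exact code):

    # Minimal PySpark session; building it launches the JVM gateway,
    # which is where the NoSuchElementException above is raised
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("repro").getOrCreate()
    print(spark.version)
    spark.stop()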

Any help?

asked Jun 15 '18 by Aakash Basu



1 Answer

I ran into this problem too. Here is what I did, in the hope that it helps:

1. Find your Spark version (e.g. with spark-submit --version); mine is 2.4.3.

2. Find your PySpark version (e.g. with pip show pyspark); mine was 2.2.0.

3. Reinstall PySpark so that it matches your Spark version:

pip install pyspark==2.4.3

After that, everything worked. The versions have to match because the newer Spark JVM's PythonGatewayServer expects the _PYSPARK_DRIVER_CONN_INFO_PATH environment variable, which the older PySpark 2.2 launcher never sets; hence the "key not found" error.
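
To double-check that the reinstall worked, you can compare the two versions from Python (a quick sketch: pyspark.__version__ comes from the pip package, spark.version from the Spark JVM it launches):

    # Both versions should now be identical, e.g. 2.4.3
    import pyspark
    from pyspark.sql import SparkSession

    print(pyspark.__version__)   # version of the pip-installed pyspark package
    spark = SparkSession.builder.getOrCreate()
    print(spark.version)         # version of the Spark JVM it connects to
    spark.stop()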

answered Oct 23 '22 by 王业强