I am using PySpark to run some commands in a Jupyter Notebook, but it is throwing an error. I tried the solutions provided in this link (Pyspark: Exception: Java gateway process exited before sending the driver its port number), such as changing the path to C:\Java and uninstalling Java SDK 10 and reinstalling Java 8, but it is still throwing the same error.
I also tried uninstalling and reinstalling pyspark, and running from the Anaconda prompt, but I still get the same error. I am using Python 3.7 and PySpark 2.4.0.
If I use this code, I get the error "Exception: Java gateway process exited before sending its port number":
from pyspark import SparkContext
from pyspark.sql import SQLContext
sc = SparkContext()  # this line raises the Java gateway exception
sqlContext = SQLContext(sc)
from pyspark.mllib.linalg import Vector, Vectors
from nltk.stem.wordnet import WordNetLemmatizer
from pyspark.ml.feature import RegexTokenizer, StopWordsRemover, Word2Vec
But if I remove the SparkContext, the code runs fine; however, I need a SparkContext for my solution. The code below, without a SparkContext, does not throw any error.
from pyspark import SparkContext
from pyspark.sql import SQLContext
from pyspark.mllib.linalg import Vector, Vectors
from nltk.stem.wordnet import WordNetLemmatizer
from pyspark.ml.feature import RegexTokenizer, StopWordsRemover, Word2Vec
I would appreciate any help figuring this out. I am using a 64-bit Windows 10 operating system.
Here is a picture of the full error.
Set PYSPARK_SUBMIT_ARGS with a master; this resolves "Exception: Java gateway process exited before sending the driver its port number". If the issue still isn't resolved, check your Java installation and the JAVA_HOME environment variable.
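For example, on Windows you can set both of these from Python at the top of the notebook, before the SparkContext is created. This is only a minimal sketch; the JDK path below is a placeholder, so point it at wherever your Java 8 JDK is actually installed:

import os

# Placeholder path: replace with your actual Java 8 JDK directory
os.environ["JAVA_HOME"] = r"C:\Program Files\Java\jdk1.8.0_201"

# Tell PySpark which master to use when it launches the Java gateway
os.environ["PYSPARK_SUBMIT_ARGS"] = "--master local[2] pyspark-shell"

from pyspark import SparkContext
sc = SparkContext()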
Try this:
sudo add-apt-repository ppa:webupd8team/java
sudo apt-get update
sudo apt-get install oracle-java8-installer
This worked for me on Linux. It should work for Windows too.
Since you are a Windows user, this link will help you: https://superuser.com/questions/947220/how-to-install-packages-apt-get-install-in-windows
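After installing, you can quickly confirm from Python that Java is actually reachable before starting Spark. This is just a sketch of such a check, not part of the installation steps above:

import os
import subprocess

# Show which JDK Spark will try to use (None means JAVA_HOME is not set)
print("JAVA_HOME =", os.environ.get("JAVA_HOME"))

# `java -version` prints the installed version (on stderr); raises if java is missing from PATH
subprocess.run(["java", "-version"], check=True)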
Type this in your bash terminal, and it will be fixed:
export PYSPARK_SUBMIT_ARGS="--master local[2] pyspark-shell"
All this does is export pyspark-shell to the shell environment variable PYSPARK_SUBMIT_ARGS.