I want to run Spark on a local machine using pyspark. From here I use the commands:
$ sbt/sbt assembly
$ ./bin/pyspark
The install completes, but pyspark is unable to run, resulting in the following error (in full):
138:spark-0.9.1 comp_name$ ./bin/pyspark
Python 2.7.6 |Anaconda 1.9.2 (x86_64)| (default, Jan 10 2014, 11:23:15)
[GCC 4.0.1 (Apple Inc. build 5493)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
Traceback (most recent call last):
File "/Users/comp_name/Downloads/spark-0.9.1/python/pyspark/shell.py", line 32, in <module>
sc = SparkContext(os.environ.get("MASTER", "local"), "PySparkShell", pyFiles=add_files)
File "/Users/comp_name/Downloads/spark-0.9.1/python/pyspark/context.py", line 123, in __init__
self._jsc = self._jvm.JavaSparkContext(self._conf._jconf)
File "/Users/comp_name/Downloads/spark-0.9.1/python/lib/py4j-0.8.1-src.zip/py4j/java_gateway.py", line 669, in __call__
File "/Users/comp_name/Downloads/spark-0.9.1/python/lib/py4j-0.8.1-src.zip/py4j/protocol.py", line 300, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling None.org.apache.spark.api.java.JavaSparkContext.
: java.net.UnknownHostException: 138.7.100.10.in-addr.arpa: 138.7.100.10.in-addr.arpa: nodename nor servname provided, or not known
at java.net.InetAddress.getLocalHost(InetAddress.java:1466)
at org.apache.spark.util.Utils$.findLocalIpAddress(Utils.scala:355)
at org.apache.spark.util.Utils$.localIpAddress$lzycompute(Utils.scala:347)
at org.apache.spark.util.Utils$.localIpAddress(Utils.scala:347)
at org.apache.spark.util.Utils$.localIpAddressHostname$lzycompute(Utils.scala:348)
at org.apache.spark.util.Utils$.localIpAddressHostname(Utils.scala:348)
at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:395)
at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:395)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.util.Utils$.localHostName(Utils.scala:395)
at org.apache.spark.SparkContext.<init>(SparkContext.scala:124)
at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:47)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:234)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:379)
at py4j.Gateway.invoke(Gateway.java:214)
at py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:79)
at py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:68)
at py4j.GatewayConnection.run(GatewayConnection.java:207)
at java.lang.Thread.run(Thread.java:724)
Caused by: java.net.UnknownHostException: 138.7.100.10.in-addr.arpa: nodename nor servname provided, or not known
at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
at java.net.InetAddress$1.lookupAllHostAddr(InetAddress.java:894)
at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1286)
at java.net.InetAddress.getLocalHost(InetAddress.java:1462)
... 22 more
Any ideas what I am doing wrong? I don't know where the IP address 138.7.100.10
comes from.
I get this error whether or not I use MAMP to create a localhost.
Thanks in advance!
You can find most of the PySpark Python files in spark-3.0.0-bin-hadoop3.2/python/pyspark. So if you'd like to use the Java or Scala interface, or deploy a distributed system with Hadoop, you must download the full Spark distribution from Apache Spark and install it.
The right solution is to set the SPARK_LOCAL_IP environment variable to localhost (or whatever your host name is).
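For example (a sketch, assuming a Unix-like shell and that you start PySpark from the Spark directory as in the question):

```shell
# Bind Spark to the loopback address so it skips the reverse-DNS
# lookup that raises the UnknownHostException above.
export SPARK_LOCAL_IP=127.0.0.1
# then launch the shell as before:
# ./bin/pyspark
```

You can also put the export in conf/spark-env.sh so it applies to every launch.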
I had the same problem with Spark, and it is related to your laptop's IP.
My solution:
sudo nano /etc/hosts
Below
127.0.0.1 localhost localhost.localdomain localhost4 localhost4.localdomain4
add
127.0.0.1 LAPTOPNAME
Your LAPTOPNAME can be found in your Terminal prompt, e.g. root@LAPTOPNAME (whichever name you set up during your installation).
It will run with Java 1.7.
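As a quick sanity check (hypothetical commands, assuming macOS or Linux), you can confirm which name Java's getLocalHost() will try to resolve and that it now resolves locally:

```shell
# Print the hostname that Spark will ask the resolver about.
hostname
# After adding "127.0.0.1 LAPTOPNAME" to /etc/hosts, this lookup
# should answer from 127.0.0.1 without any DNS round-trip:
# ping -c 1 "$(hostname)"
```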