I have installed the setup below, with these versions: Hadoop 1.0.3, Java 1.7.0_67, Scala 2.11.7, Spark 2.1.1.
I am getting the error below; can anyone help me with this?
root@sparkmaster:/home/user# spark-shell
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
17/07/05 01:07:35 WARN SparkContext: Support for Java 7 is deprecated as of Spark 2.0.0
17/07/05 01:07:36 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/07/05 01:07:37 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
17/07/05 01:07:37 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Cannot assign requested address: Service 'sparkDriver' failed after 16 retries (starting from 0)! Consider explicitly setting the appropriate port for the service 'sparkDriver' (for example spark.ui.port for SparkUI) to an available port or increasing
<console>:14: error: not found: value spark
import spark.implicits._
<console>:14: error: not found: value spark
import spark.sql
Using Scala version 2.11.8 (Java HotSpot(TM) Client VM, Java 1.7.0_67)
Type in expressions to have them evaluated.
Type :help for more information.
scala>
There are a few different solutions:
Get your hostname:
$ hostname
then try to assign your hostname:
$ sudo hostname -s 127.0.0.1
Start spark-shell. (A complete example session is sketched below.)
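A minimal end-to-end run of this step, assuming the machine's current hostname is sparkmaster (hypothetical; use whatever hostname prints on your machine):
$ hostname
sparkmaster
$ sudo hostname -s 127.0.0.1
$ spark-shell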
Add your hostname to your /etc/hosts file (if it is not already present):
127.0.0.1 your_hostname
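For example, if hostname prints sparkmaster (hypothetical), the relevant /etc/hosts lines would look like this (the localhost line is normally already there):
127.0.0.1 localhost
127.0.0.1 sparkmaster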
Add the environment variable
export SPARK_LOCAL_IP="127.0.0.1"
to load-spark-env.sh.
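As a sketch, the file after the edit (the added export is the only change; everything else stays as shipped):
# ... original contents of load-spark-env.sh ...
export SPARK_LOCAL_IP="127.0.0.1"  # added: force the driver to bind to loopback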
The steps above solved my problem, but you can also try adding
export SPARK_LOCAL_IP=127.0.0.1
under the comment for the local IP in the template file spark-env.sh.template (in /usr/local/Cellar/apache-spark/2.1.0/libexec/conf/), and then:
cp spark-env.sh.template spark-env.sh
spark-shell
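For reference, the edited section of spark-env.sh would look roughly like this (the exact comment wording may vary between Spark versions):
# - SPARK_LOCAL_IP, to set the IP address Spark binds to on this node
export SPARK_LOCAL_IP=127.0.0.1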
If none of the above fixes it, check your firewall and enable it if it is not already enabled.
Add SPARK_LOCAL_IP in load-spark-env.sh as
export SPARK_LOCAL_IP="127.0.0.1"
The load-spark-env.sh file is located in the spark/bin directory.
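One quick way to add the line, assuming SPARK_HOME points at your Spark installation (adjust the path if yours differs):
echo 'export SPARK_LOCAL_IP="127.0.0.1"' >> $SPARK_HOME/bin/load-spark-env.sh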
Or you can add your hostname in the /etc/hosts file as
127.0.0.1 hostname
You can get your hostname by typing hostname in the terminal.
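To confirm the mapping took effect, you can check what your hostname now resolves to (getent is available on most Linux systems; the output assumes the entry above):
$ getent hosts $(hostname)
127.0.0.1       hostname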
Hope this solves the issue!
Had a similar issue in my IntelliJ.
Reason: I was on Cisco AnyConnect VPN.
Fix: after disconnecting from the VPN, the issue did not appear.
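A quick way to check whether a VPN is the culprit is to see which address your hostname resolves to while connected (standard ping, nothing Spark-specific):
$ ping -c 1 $(hostname)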
Type hostname to have a look at your current hostname, then run vim /etc/hosts and map the hostname you just got to your exact IP or to 127.0.0.1.
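For instance, if hostname prints myhost and the machine's address is 192.168.0.5 (both hypothetical), the /etc/hosts entry would be either
192.168.0.5 myhost
or
127.0.0.1 myhost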