I have installed Spark on AWS. The EC2 instances themselves work, but Spark does not start. When I check the sparkMaster log I see the following:
Spark Command: /usr/lib/jvm/java-8-oracle/jre/bin/java -cp /home/ubuntu/spark/conf/:/home/ubuntu/spark/jars/* -Xmx1g org.apache.spark$
========================================
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
16/09/12 09:40:18 INFO Master: Started daemon with process name: 5451@server1
16/09/12 09:40:18 INFO SignalUtils: Registered signal handler for TERM
16/09/12 09:40:18 INFO SignalUtils: Registered signal handler for HUP
16/09/12 09:40:18 INFO SignalUtils: Registered signal handler for INT
16/09/12 09:40:18 WARN MasterArguments: SPARK_MASTER_IP is deprecated, please use SPARK_MASTER_HOST
16/09/12 09:40:19 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where a$
16/09/12 09:40:19 INFO SecurityManager: Changing view acls to: ubuntu
16/09/12 09:40:19 INFO SecurityManager: Changing modify acls to: ubuntu
16/09/12 09:40:19 INFO SecurityManager: Changing view acls groups to:
16/09/12 09:40:19 INFO SecurityManager: Changing modify acls groups to:
16/09/12 09:40:19 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set$
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7077. Attempting port 7078.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7078. Attempting port 7079.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7079. Attempting port 7080.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7080. Attempting port 7081.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7081. Attempting port 7082.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7082. Attempting port 7083.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7083. Attempting port 7084.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7084. Attempting port 7085.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7085. Attempting port 7086.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7086. Attempting port 7087.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7087. Attempting port 7088.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7088. Attempting port 7089.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7089. Attempting port 7090.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7090. Attempting port 7091.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7091. Attempting port 7092.
16/09/12 09:40:19 WARN Utils: Service 'sparkMaster' could not bind on port 7092. Attempting port 7093.
Exception in thread "main" java.net.BindException: Cannot assign requested address: Service 'sparkMaster' failed after 16 retries! Co$
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:433)
at sun.nio.ch.Net.bind(Net.java:425)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:223)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
My /etc/hosts is the following:
127.0.0.1 localhost
52.211.60.97 server1
52.210.246.199 client1
52.211.71.126 client2
52.211.20.213 client3
# The following lines are desirable for IPv6 capable hosts
::1 ip6-localhost ip6-loopback
fe00::0 ip6-localnet
ff00::0 ip6-mcastprefix
ff02::1 ip6-allnodes
ff02::2 ip6-allrouters
And this is my spark-env.sh:
export SPARK_WORKER_MEMORY=512m
export SPARK_EXECUTOR_MEMORY=512m
export SPARK_WORKER_INSTANCES=1
export SPARK_WORKER_CORES=1
export SPARK_WORKER_DIR=/home/ubuntu/spark
export SPARK_LOCAL_IP=52.211.60.97
export SPARK_MASTER_IP=52.211.60.97
export SPARK_MASTER_WEBUI_PORT=4041
I have tried the same playbook using an AWS VPC with private instances and a VPN, and it works fine. So I think the problem is related to the public IP. Does Amazon block some ports on the public IP, or what could be the problem?
I was also facing a similar issue.
This is caused by the Spark master not being able to bind a port on the address given in SPARK_MASTER_IP. On EC2 the public IP is not assigned to any network interface on the instance (AWS maps it to the private IP via NAT), so binding to it fails with "Cannot assign requested address".
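You can confirm this by listing the addresses actually assigned to the instance (a quick check; the private address shown is illustrative):
hostname -I
# prints something like: 172.31.5.100 -- only the private IP appears.
# The public IP 52.211.60.97 is never listed, which is why the bind fails.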
First, find your hostname with the hostname command. Then make sure the machine's IP address in /etc/hosts is mapped to that hostname. Finally, use that hostname for SPARK_MASTER_IP, as in the sketch below.
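A minimal sketch of the resulting configuration (the hostname server1 matches the question; the private IP 172.31.5.100 is illustrative, assuming a typical EC2 setup):
hostname                           # prints: server1
# /etc/hosts should map the *private* IP to that hostname:
# 172.31.5.100 server1
# spark-env.sh (SPARK_MASTER_HOST is the non-deprecated name the log warning suggests):
export SPARK_MASTER_HOST=server1
export SPARK_LOCAL_IP=server1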
For this issue, in cluster mode, you can also set export SPARK_LOCAL_IP=127.0.0.1.
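Note that with 127.0.0.1 the master binds only to the loopback interface, so this only works when the driver, master, and workers all run on the same machine; for a multi-node cluster, prefer the hostname mapping above.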
PS: I know this is a late reply, but it may help others who come across this issue.