 

How to start apache-spark slave instance on a standalone environment?

These are the steps I've taken so far:

  1. Download spark-1.4.1-bin-hadoop2.6.tgz
  2. Extract it
  3. ./spark-1.4.1-bin-hadoop2.6/sbin/start-all.sh
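For context: in Spark 1.x, `start-all.sh` starts the master locally and then SSHes to each host listed in `conf/slaves` to launch the workers, defaulting to `localhost` when that file is absent. For a single-node standalone cluster, a minimal `conf/slaves` would just be:

```
# conf/slaves -- one worker host per line (single-node setup assumed)
localhost
```

This is why the script needs SSH access even when everything runs on one machine.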

The master starts, but the slave doesn't.

This is the output:

[ec2-user@ip-172-31-24-107 ~]$ sudo ./spark-1.4.1-bin-hadoop2.6/sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /home/ec2-user/spark-1.4.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-ip-172-31-24-107.out
localhost: Permission denied (publickey).
[ec2-user@ip-172-31-24-107 ~]$

This is the secure log:

Aug  9 00:09:30 ip-172-31-24-107 sudo: ec2-user : TTY=pts/0 ; PWD=/home/ec2-user ; USER=root ; COMMAND=./spark-1.4.1-bin-hadoop2.6/sbin/start-all.sh
Aug  9 00:09:32 ip-172-31-24-107 sshd[4828]: Connection closed by 127.0.0.1 [preauth]

I believe the problem is with SSH, but I haven't been able to find a solution on Google...

Any idea how to fix my SSH issue?

asked Nov 14 '25 by masber


1 Answer

You need to set up passwordless SSH to localhost. Try:

 cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

Then restart the cluster. If that does not work, please post the new error message(s).
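A fuller hedged sketch, assuming OpenSSH with the default `id_rsa` key name (the one-liner above only works if a key pair already exists and permissions are correct):

```shell
# Sketch of a passwordless-SSH setup for localhost (OpenSSH assumed).
mkdir -p ~/.ssh
chmod 700 ~/.ssh

# Generate a key pair only if none exists yet (empty passphrase).
[ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa

# Authorize the public key for logins to this host, avoiding duplicates.
grep -qxF "$(cat ~/.ssh/id_rsa.pub)" ~/.ssh/authorized_keys 2>/dev/null \
  || cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

# sshd silently ignores authorized_keys if its permissions are too open.
chmod 600 ~/.ssh/authorized_keys

# Verify manually (should log in without a password prompt):
#   ssh localhost true
```

Note that `sshd` is strict about permissions: a group- or world-writable `~/.ssh` or `authorized_keys` will cause exactly the `Permission denied (publickey)` error seen above even when the key is present.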

answered Nov 17 '25 by WestCoastProjects


