I followed https://www.anchormen.nl/spark-docker/ and when I executed the command, the Apache Spark master startup failed, saying "--" is not recognized. I am very new to Spark; requesting help from our trusted community members.
bash /opt/spark/sbin/start-master.sh
starting org.apache.spark.deploy.master.Master, logging to /opt/spark/logs/spark--org.apache.spark.deploy.master.Master-1-e6b8f9219a40.out
failed to launch: nice -n 0 /opt/spark/bin/spark-class org.apache.spark.deploy.master.Master --host e6b8f9219a40 --port 7077 --webui-port 8080
nohup: can't execute '--': No such file or directory
full log in /opt/spark/logs/spark--org.apache.spark.deploy.master.Master-1-e6b8f9219a40.out
The only difference from the article is that I used Alpine Linux, which I am restricted to.
To verify, I tried to cat the log file and got the same error. Alpine Linux in Docker is not recognizing "--". Am I doing something wrong?
[SOLVED] Thanks, Robert.
If anyone is looking for the answer, add the following to your Dockerfile:
RUN apk update && apk upgrade && apk add curl ca-certificates tar supervisor bash procps coreutils
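For context, a minimal Dockerfile sketch of how that line fits in (the Spark paths follow the question; the base image tag and copy step are assumptions you should adapt to your own setup):

```dockerfile
# Assumed base image; any recent Alpine tag should behave the same way.
FROM alpine:3.18

# Install GNU coreutils (provides a full nohup) and procps (provides a ps
# that understands -p) BEFORE any Spark script runs. Busybox's built-in
# nohup and ps cannot handle the flags Spark's launch scripts pass them.
RUN apk update && apk upgrade && \
    apk add curl ca-certificates tar supervisor bash procps coreutils

# ... copy or extract your Spark distribution into /opt/spark here ...

# Start the standalone master, as in the question.
CMD ["bash", "/opt/spark/sbin/start-master.sh"]
```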
Your problem is here, in this line:

nohup: can't execute '--': No such file or directory

The thing is that the alpine image comes with busybox, which replaces many commands, including nohup and ps. So Alpine ships a non-GNU nohup which cannot handle that --, and a ps which cannot handle -p.

So, install the coreutils and procps packages before any call to the Apache Spark scripts, in order to have the versions of nohup and ps that you need.
In Dockerfile or container command line:
RUN apk --update add coreutils procps