Why do I get a "spark-shell: Permission denied" error when setting up Spark?

I am new to Apache Spark. I am trying to set up Apache Spark on my MacBook. I downloaded "spark-2.4.0-bin-hadoop2.7" from the official Apache Spark website.
When I try to run ./bin/spark-shell or ./bin/pyspark I get a Permission denied error.
I just want to run Spark on my local machine.
I also tried granting permissions to all the folders, but it does not help. Why do I get this error?

Asked Oct 17 '25 by stef

2 Answers

This should solve your problem:

chmod +x /Users/apple/spark-2.4.0-bin-hadoop2.7/bin/*

Then you can try executing bin/pyspark (the Spark shell in Python) or bin/spark-shell (the Spark shell in Scala).
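
For completeness, here is a minimal sketch of the whole fix from the terminal, assuming the archive was extracted to /Users/apple/spark-2.4.0-bin-hadoop2.7 as in the command above:

cd /Users/apple/spark-2.4.0-bin-hadoop2.7
# make every launcher script in bin/ executable
chmod +x bin/*
# verify the execute bit is now set (expect something like -rwxr-xr-x)
ls -l bin/spark-shell
# start the Scala shell against a local master
./bin/spark-shell --master "local[*]"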

Answered Oct 20 '25 by Oli


I solved this issue by adding the /libexec folder to the Spark home path.

Set $SPARK_HOME to:

/usr/local/Cellar/apache-spark/<your_spark_version>/libexec
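
For example, assuming a Homebrew-installed Spark (substitute your actual version directory for the placeholder), the variable can be set in your shell profile (~/.zshrc or ~/.bash_profile) along these lines:

# point SPARK_HOME at the libexec directory of the Homebrew install
export SPARK_HOME=/usr/local/Cellar/apache-spark/<your_spark_version>/libexec
# put the Spark launchers on the PATH so spark-shell and pyspark resolve
export PATH="$SPARK_HOME/bin:$PATH"

After reloading the profile (e.g. source ~/.zshrc), spark-shell and pyspark should start from any directory.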

Answered Oct 20 '25 by stef


