
Spark: Trying to run spark-shell, but get 'cmd' is not recognized as an internal or external command

Tags:

apache-spark

I'm trying to install Spark on my Windows desktop. Everything should work fine, but I get an error "'cmd' is not recognized as an internal or external command... "

I installed Scala, Java JDK and unzipped Spark tgz in C:\, but for some reason can't get Spark to start in cmd. Any ideas?

ElinaJ asked Jun 21 '15

5 Answers

My colleague solved the problem. Although Java itself seemed to work fine (see the screenshot), the Java path Spark was trying to read had an extra \bin at the end. When that was removed, Spark started working! @gonbe, thank you so much for your efforts to help!
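This trailing-\bin mistake is easy to check for programmatically. The sketch below is illustrative only — the helper name and the JDK path are hypothetical, not taken from the question:

```python
def java_home_problems(java_home):
    """Return a list of likely problems with a JAVA_HOME value."""
    problems = []
    if not java_home:
        problems.append("JAVA_HOME is not set")
    # A trailing \bin is the mistake described above: JAVA_HOME should
    # point at the JDK root, because tools append \bin themselves
    # (otherwise the effective path becomes ...\bin\bin).
    elif java_home.rstrip("\\/").lower().endswith(("\\bin", "/bin")):
        problems.append("JAVA_HOME ends with \\bin; remove it")
    return problems

# The broken shape from this answer vs. a corrected value
# (hypothetical JDK install path):
print(java_home_problems(r"C:\Program Files\Java\jdk1.8.0_45\bin"))
print(java_home_problems(r"C:\Program Files\Java\jdk1.8.0_45"))
```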

ElinaJ answered Oct 11 '22


I was getting the same error while executing spark-shell in the command prompt.

I tried everything mentioned above but was not able to resolve the issue.

So, at last, I added "C:\Windows\System32" to the 'PATH' variable under System Variables, and it worked.
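You can verify whether System32 is actually among your PATH entries before editing anything. A minimal sketch (the helper is mine; `ntpath` is used so the Windows path comparison works on any platform):

```python
import ntpath

def has_system32(path_value):
    """Check whether C:\\Windows\\System32 appears in a ;-separated
    Windows PATH string. spark-shell.cmd shells out to cmd.exe, which
    lives in System32, so a PATH missing that directory produces the
    "'cmd' is not recognized" error."""
    entries = {ntpath.normcase(p.strip().rstrip("\\"))
               for p in path_value.split(";") if p.strip()}
    return ntpath.normcase(r"C:\Windows\System32") in entries

print(has_system32(r"C:\Scala\bin;C:\spark\bin"))         # System32 missing
print(has_system32(r"C:\Windows\System32;C:\spark\bin"))  # present
```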

Rishav Sharma answered Sep 19 '22


(I'm not a Windows Spark user.) The spark-shell.cmd script for Windows expects the "cmd" command to be available on PATH:

https://github.com/apache/spark/blob/master/bin/spark-shell.cmd

Would you try adding the directory that contains cmd.exe to the PATH environment variable? The directory location is shown in the title bar of your screenshot, and environment variables can be set via the Control Panel.
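The lookup that fails here can be modeled in a few lines. This is a simplified illustration of a PATH search, not the actual resolution logic Windows uses (which also consults PATHEXT and the current directory):

```python
import os

def find_executable(name, path_dirs):
    """Return the full path of the first directory in path_dirs that
    contains a file called `name`, or None if no directory has it.
    When cmd.exe cannot be found this way, you get the
    "'cmd' is not recognized" error this question describes."""
    for d in path_dirs:
        candidate = os.path.join(d, name)
        if os.path.isfile(candidate):
            return candidate
    return None

# On Windows this would locate cmd.exe if C:\Windows\System32 is on PATH:
located = find_executable("cmd.exe", os.environ.get("PATH", "").split(os.pathsep))
print(located)
```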

suztomo answered Oct 11 '22


I had a similar error. I fixed it after making the following changes:

  1. There were multiple Java bin paths in the system PATH. I corrected them to a single Java bin entry, in sync with JAVA_HOME.
  2. Added C:\Windows\System32 to the system PATH variable.
  3. My JAVA_HOME and java.exe were pointing to different places. I fixed them.

Now it works.

Thanks guys.
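The first conflict in that list — several Java bin directories competing on PATH — can be spotted with a small script. A heuristic sketch (the helper name and sample PATH are hypothetical):

```python
import ntpath

def java_bin_entries(path_value):
    """Return the PATH entries that look like Java bin directories.
    More than one usually means the java.exe that gets picked up may
    not match JAVA_HOME -- the conflict described in this answer."""
    entries = [p.strip() for p in path_value.split(";") if p.strip()]
    result = []
    for p in entries:
        norm = ntpath.normcase(p).rstrip("\\")
        # Heuristic: an entry mentioning "java" and ending in \bin.
        if "java" in norm and norm.endswith("\\bin"):
            result.append(p)
    return result

path = r"C:\Program Files\Java\jdk1.8.0\bin;C:\Windows\System32;C:\Java\jre7\bin"
print(java_bin_entries(path))  # two Java bin entries -> likely conflict
```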

Saravanan Subramanian answered Oct 11 '22


Check the value of JAVA_HOME and make sure it points to the correct location. Add %JAVA_HOME%\bin to the PATH value. After the modification, close the command prompt and reopen it. Type spark-shell and it will run.
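A quick way to confirm the %JAVA_HOME%\bin entry made it into PATH is to compare the two values directly. An illustrative sketch under the same assumptions as above (helper name is mine; `ntpath` mimics Windows path comparison):

```python
import ntpath

def java_home_bin_on_path(java_home, path_value):
    """Check that %JAVA_HOME%\\bin (after expansion) appears among the
    entries of a ;-separated Windows PATH string, which is what this
    answer recommends setting up."""
    expected = ntpath.normcase(ntpath.join(java_home, "bin")).rstrip("\\")
    entries = {ntpath.normcase(p.strip().rstrip("\\"))
               for p in path_value.split(";") if p.strip()}
    return expected in entries

# Hypothetical values for illustration:
print(java_home_bin_on_path(
    r"C:\Program Files\Java\jdk1.8.0_45",
    r"C:\Program Files\Java\jdk1.8.0_45\bin;C:\Windows\System32"))
```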

Mayank Gupta answered Oct 11 '22