I keep trying to get the Spark shell running, with no luck so far.
I've downloaded both the prepackaged and unpackaged variants (the unpackaged one built with Maven and with sbt). I've attempted to resolve the issue three different ways, all to no avail.
1) From my Spark directory, I attempt to start the shell with variants of spark-shell.cmd or .\bin\spark-shell.cmd.
I consistently get an error along these lines:
'C:\Program' is not recognized as an internal or external command, operable program or batch file.
Knowing a possible whitespace error when I see one, I've attempted variants of the command with quotes, full paths, etc. No results thus far.
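For illustration, the variants I tried looked something like this (the install path here is an example, not my exact setup):

.\bin\spark-shell.cmd
:: quoting the full path, in case the space was in my own directory name
"C:\spark-1.3.1-bin-hadoop2.6\bin\spark-shell.cmd"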
2) Next, I tried simply moving my Spark directory to the top level of my hard drive (C:\spark-1.3.1-bin-hadoop2.6).
With whitespace eliminated as a possible issue, my error messages now fall along these lines:
find: 'version': No such file or directory
else was unexpected at this time.
3) I've tried to invoke Spark through Scala somehow (as some documents and screencasts give the impression is possible). I can confirm that Scala (2.11.6) is properly configured in my environment variables, and its shell works correctly.
If there really is a command that starts the Spark shell from there, I'm listening. My attempts through Scala so far are another dead end.
Thank you.
In the file bin\spark-class2.cmd, find the line
set RUNNER="%JAVA_HOME%\bin\java"
and replace it with (removing the quotes):
set RUNNER=%JAVA_HOME%\bin\java
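If editing the script feels too invasive, another option (my own suggestion, not part of the fix above) is to point JAVA_HOME at the 8.3 short name of the Java directory, so the value contains no spaces at all. From an interactive cmd prompt (the JDK path below is illustrative; in a batch file, double the % signs):

:: Print the short (8.3) form of the real install path
for %I in ("C:\Program Files\Java\jdk1.8.0_101") do @echo %~sI
:: Then set JAVA_HOME to whatever it prints, typically something like:
set JAVA_HOME=C:\PROGRA~1\Java\jdk1.8.0_101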
Moving the Spark directory directly under C:\ worked for me (C:\spark-1.6.0-bin-hadoop2.6).
I also updated the system's PATH variable so the correct find.exe is picked up.
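For the find error specifically: my understanding is that Spark's batch scripts expect the Windows find.exe from C:\Windows\System32, and a GNU find (installed by Git, Cygwin, or GnuWin32) appearing earlier in PATH can shadow it and produce the find: 'version' message. A quick way to check, plus a session-local workaround (assuming the standard System32 location):

:: List every find.exe on PATH, in resolution order
where find
:: If System32 is not listed first, put it first for this session
set PATH=C:\Windows\System32;%PATH%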