I'm installing Apache Spark, which uses its own copy of SBT to set things up.
I'm using Linux Mint in a VirtualBox VM.
Here's a snippet from the error when I run sudo ./sbt/sbt compile from the Spark directory spark-0.9.0-incubating:
[error] (core/compile:compile) java.io.IOException: Cannot run program "javac": error=2, No such file or directory
[error] Total time: 181 s, completed Mar 9, 2014 12:48:03 PM
I can run java and javac from the command line just fine: e.g. javac -version gives javac 1.6.0_31. The correct jdk1.6.0_31/bin is in my PATH.
I read that the error might be due to the 64-bit JDK I had installed, but I get the same error with the 32-bit JDK.
How can I sort out the issue?
Edit: I'm using the bash shell.
DISCLAIMER: I'm mostly guessing here, and I'm still not sure whether I should be answering rather than adding a comment. Until that's clear, the disclaimer stays.
When you execute java and javac from the command line, which user are you running as at that moment? I'm fairly sure your problem surfaces because you're operating as different users. Notice that you're executing sudo ./sbt/sbt compile as root (due to the way sudo works), but you say nothing about which user(s) you've been using to run the javac and java commands.
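To see the effect for yourself, you can simulate looking up javac under a restricted PATH, the kind root often gets from sudo's secure_path policy. The /nonexistent directory below is a deliberate placeholder, purely for illustration:

```shell
# Simulate a javac lookup with a restricted PATH, similar to what a
# restrictive sudo secure_path can produce. /nonexistent is a
# placeholder directory, not a real install location.
env PATH=/nonexistent /bin/sh -c 'command -v javac || echo "javac not on this PATH"'
# To inspect root's actual view, run interactively:
#   sudo sh -c 'command -v javac; echo $PATH'
```

The lookup fails even if javac works fine for your own user, which is exactly the symptom in the question.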
Add jdk1.6.0_31/bin to the PATH for root and you'll be all set (as far as the Java configuration is concerned).
I'd also recommend setting JAVA_HOME to point to jdk1.6.0_31, as it may help at times -- many applications use it to find the location of the Java installation.
As a workaround, you may edit ./sbt/sbt and set PATH and JAVA_HOME appropriately.
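A sketch of that workaround -- note the JDK location below (/usr/lib/jvm/jdk1.6.0_31) is an assumed example; point it at wherever your jdk1.6.0_31 directory actually lives:

```shell
# Add near the top of ./sbt/sbt. The JDK path is an assumed
# example location -- substitute your actual install directory.
export JAVA_HOME=/usr/lib/jvm/jdk1.6.0_31
export PATH="$JAVA_HOME/bin:$PATH"
```

Because the script itself sets these variables, it no longer matters which user's environment sudo strips away.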
You need to install the javac executable. To do this on Ubuntu, run the following command:
sudo apt-get install openjdk-7-jdk
This also places javac on your PATH.
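Afterwards you can confirm the compiler is visible. This check always exits successfully and simply reports what it finds:

```shell
# Print the path to javac if it is on the PATH; otherwise print a hint.
# (The fallback echo means the line always exits with status 0.)
command -v javac || echo "javac not found -- install a JDK first"
```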