I'm running the word count example provided by Hadoop. The following is the error I am getting:
-bash-4.1$ ./hadoop jar /home/chanders/Hadoop/bin/hadoop-examples-0.20.203.0.jar wordcount /usr/temp_hadoop /usr/output
Warning: Maximum heap size rounded up to 1024 MB
Warning: Maximum heap size rounded up to 1024 MB
Exception in thread "main" java.io.IOException: Error opening job jar: /home/chanders/Hadoop/bin/hadoop-examples-0.20.203.0.jar
at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
Caused by: java.util.zip.ZipException: error in opening zip file
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:129)
at java.util.jar.JarFile.<init>(JarFile.java:141)
at java.util.jar.JarFile.<init>(JarFile.java:78)
at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
-bash-4.1$ ^C
-bash-4.1$ ./hadoop jar /home/chanders/Hadoop/bin/hadoop*examples*.jar wordcount /usr/temp_hadoop /usr/output
Warning: Maximum heap size rounded up to 1024 MB
Warning: Maximum heap size rounded up to 1024 MB
Exception in thread "main" java.io.IOException: Error opening job jar: /home/chanders/Hadoop/bin/hadoop*examples*.jar
at org.apache.hadoop.util.RunJar.main(RunJar.java:90)
Caused by: java.util.zip.ZipException: error in opening zip file
at java.util.zip.ZipFile.open(Native Method)
at java.util.zip.ZipFile.<init>(ZipFile.java:129)
at java.util.jar.JarFile.<init>(JarFile.java:141)
at java.util.jar.JarFile.<init>(JarFile.java:78)
at org.apache.hadoop.util.RunJar.main(RunJar.java:88)
Information about my installation: I followed the installation instructions from Michael Noll's link Here. The only change is that I haven't created a separate hadoop user; I'm running everything under my own login.
There was a mistake in the path to the jar. Correcting the path fixed the error.
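As a quick sanity check, you can confirm the jar actually exists at the path before passing it to hadoop jar. The path below is only an assumption, based on the usual Hadoop 0.20.203.0 layout where the examples jar sits in the Hadoop home directory rather than in bin/; adjust it to wherever your jar really lives:
-bash-4.1$ ls -l /home/chanders/Hadoop/hadoop-examples-0.20.203.0.jar   # verify the jar is present and readable
-bash-4.1$ ./hadoop jar /home/chanders/Hadoop/hadoop-examples-0.20.203.0.jar wordcount /usr/temp_hadoop /usr/output
If the ls fails, the path you are passing to hadoop jar is wrong and RunJar will throw exactly this ZipException.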
Another common cause (seen on other forums) is a permissions problem on the jar file or on the directory containing it, which prevents Hadoop from opening the jar. Fixing the permissions lets Hadoop read the jar file.
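If that is the case, something along these lines should reveal and fix it. The paths are the same assumed ones as above, so substitute your actual Hadoop directory and jar location:
-bash-4.1$ ls -ld /home/chanders/Hadoop /home/chanders/Hadoop/hadoop-examples-0.20.203.0.jar   # check read and execute bits
-bash-4.1$ chmod a+r /home/chanders/Hadoop/hadoop-examples-0.20.203.0.jar                      # jar must be readable by the user running the job
-bash-4.1$ chmod a+x /home/chanders/Hadoop                                                     # directory must be traversable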