I was reading the Apache Hadoop MapReduce tutorial. I was able to run the example and get the desired result. However, I am not able to understand why it asks us to run the following to compile the Java file:
$ bin/hadoop com.sun.tools.javac.Main WordCount.java
I went through the Hadoop command documentation. It groups the Hadoop commands under General Options, User Commands, and Administrator Commands, but I could not find the above javac invocation anywhere on that page.
Q. In fact, I do not understand how the above command works at all. We usually pass hyphen-prefixed options to commands, but here the command runs javac in a way I have never seen before. I normally compile Java files directly with javac filename.java. Why are we asked not to do it that way here?
Q. Also, there is .Main in com.sun.tools.javac.Main in the given command. What does this mean?
I know I must be missing some very basic understanding of how commands work in general. I am also very new to Linux, so maybe that is the reason I am not getting this.
javac is used to compile your Java code. If you look at the WordCount example, there are a lot of import statements (the org.apache.hadoop.* classes) which require the corresponding jars to be on the classpath.
When you run hadoop com.sun.tools.javac.Main WordCount.java, Hadoop has already loaded all of these dependent jars onto the classpath before compiling your program. You can test this by running plain javac WordCount.java: the compiler will throw cannot find symbol errors if you do not already have the required jars on your classpath.
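If you would rather invoke javac yourself, one equivalent approach is to hand it Hadoop's classpath explicitly. This is only a sketch and assumes your Hadoop version provides the bin/hadoop classpath subcommand, which prints the jars Hadoop puts on its classpath:

$ javac -classpath "$(bin/hadoop classpath)" WordCount.java

This makes the dependency visible: javac now sees the same Hadoop jars that bin/hadoop would have loaded for you.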
With Hadoop you can choose which compiler class it runs to compile your code. com.sun.tools.javac.Main is the programmatic interface to the Java compiler, javac: Main is simply the class inside the com.sun.tools.javac package whose main method drives the compiler, which is what the .Main part of the command refers to.
When you execute bin/hadoop with com.sun.tools.javac.Main as the first argument and WordCount.java as the second, the hadoop script treats the first argument as a fully qualified class name, sets up Hadoop's classpath, and runs that class's main method with the remaining arguments. In other words, it uses com.sun.tools.javac.Main to compile WordCount.java.
More on com.sun.tools.javac.Main
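To make "programmatic interface" concrete, here is a minimal sketch (not from the tutorial) of calling the compiler from your own Java code. It assumes com.sun.tools.javac.Main is on the classpath; the class ships with the JDK (in tools.jar on Java 8 and earlier), and the wrapper class name CompileWordCount is just an illustration:

// Minimal sketch: driving javac through its class entry point rather than the
// javac command. Assumes com.sun.tools.javac.Main is on the classpath.
public class CompileWordCount {
    public static void main(String[] args) {
        // compile() accepts the same arguments as the javac command line and
        // returns 0 on success, non-zero on failure.
        int exitCode = com.sun.tools.javac.Main.compile(new String[] { "WordCount.java" });
        System.out.println("javac exit code: " + exitCode);
    }
}

The bin/hadoop invocation in the tutorial does essentially the same thing from the shell, with Hadoop's jars already on the classpath.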