I'm thinking about using Hadoop to process large text files on my existing Windows 2003 servers (about 10 quad-core machines with 16 GB of RAM).
The questions are:
Is there any good tutorial on how to configure a Hadoop cluster on Windows?
What are the requirements? Java + Cygwin + sshd? Anything else?
Does HDFS play nicely on Windows?
I'd like to use Hadoop in streaming mode. Any advice, tools, or tricks for developing my own mappers/reducers in C#?
What do you use for submitting and monitoring the jobs?
Thanks
OPERATING SYSTEM: You can install Hadoop on Windows or on Linux-based operating systems; Ubuntu and CentOS are very commonly used.
I will suggest two approaches you can follow. Installing Hadoop using a VM: you can set up Apache Hadoop on a virtual machine, and you will find plenty of videos and tutorials for it. Here is a link to one: https://www.edureka.co/blog/install-hadoop-single-node-hadoop-cluster. Just follow the instructions given in that post.
Is Hadoop dead altogether? In reality, Apache Hadoop is not dead, and many organizations still use it as a robust data analytics solution. One key indicator is that all of the major cloud providers actively support Apache Hadoop clusters on their respective platforms.
To check whether the Hadoop daemons are running, just run the jps command in a shell (make sure a JDK is installed on the system). It lists all running Java processes, including any Hadoop daemons that are up.
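For example, on any node in the cluster (assuming the JDK's bin directory is on the PATH; the daemon names mentioned below are the classic ones from that generation of Hadoop, and which ones appear depends on the node's role):

    jps
    # besides Jps itself, expect entries such as NameNode, DataNode,
    # JobTracker and TaskTracker, depending on which daemons run on that node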
From the Hadoop documentation:
Win32 is supported as a development platform. Distributed operation has not been well tested on Win32, so it is not supported as a production platform.
Which I think translates to: "You're on your own."
That said, there might be hope if you're not queasy about installing Cygwin and a Java shim, according to the Getting Started page of the Hadoop wiki:
It is also possible to run the Hadoop daemons as Windows Services using the Java Service Wrapper (download this separately). This still requires Cygwin to be installed as Hadoop requires its df command.
I guess the bottom line is that it doesn't sound impossible, but you'd be swimming upstream all the way. I've done a few Hadoop installs (on Linux for production, Mac for dev) now, and I wouldn't bother with Windows when it's so straightforward on other platforms.
While not the answer you may want to hear, I would highly recommend repurposing the machines as, say, Linux servers, and running Hadoop there. You will benefit from tutorials and experience and testing performed on that platform, and spend your time solving business problems rather than operational issues.
However, you can still write your jobs in C#. Since Hadoop supports the "streaming" implementation, you can write your jobs in any language. With the Mono framework, you should be able to take pretty much any .NET code written on the Windows platform and just run the same binary on Linux.
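To make that concrete, here is a rough sketch of a streaming mapper in C# (illustrative word-count code of my own, not anything Hadoop ships): a streaming mapper just reads lines from stdin and writes tab-separated key/value pairs to stdout, which is all Hadoop streaming expects.

    using System;

    // Minimal Hadoop Streaming mapper: emits "word<TAB>1" for every word read
    // from stdin. A matching reducer would sum the counts per word. Compile
    // with csc on Windows or mcs on Mono; the same .exe runs on both.
    class WordCountMapper
    {
        static void Main()
        {
            string line;
            while ((line = Console.ReadLine()) != null)
            {
                var words = line.Split(new[] { ' ', '\t' },
                                       StringSplitOptions.RemoveEmptyEntries);
                foreach (var word in words)
                {
                    Console.WriteLine("{0}\t1", word.ToLowerInvariant());
                }
            }
        }
    }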
You can also access HDFS from Windows fairly easily -- while I don't recommend running the Hadoop services on Windows, you can certainly run the DFS client from the Windows platform to copy files in and out of the distributed file system.
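For example, with the Hadoop client unpacked on a Windows machine (under Cygwin) and fs.default.name in its configuration pointing at the cluster's NameNode, the ordinary fs shell commands work for moving data in and out (the paths below are just placeholders):

    hadoop fs -put localfile.txt /user/me/input/
    hadoop fs -ls /user/me/input
    hadoop fs -get /user/me/output/part-00000 .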
For submitting and monitoring jobs, I think that you're mainly on your own... I don't think that there are any good general-purpose systems developed for Hadoop job management yet.
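For streaming jobs specifically, "submitting" usually just means the command line, and "monitoring" means the JobTracker web UI (port 50030 in the 0.20-era defaults) plus the hadoop job command. A hedged sketch, assuming the contrib streaming jar layout of that era and a Mono-compiled mapper/reducer pair like the sketch above (the reducer is analogous):

    hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-*-streaming.jar \
        -input /user/me/input -output /user/me/wordcount \
        -mapper "mono WordCountMapper.exe" \
        -reducer "mono WordCountReducer.exe" \
        -file WordCountMapper.exe -file WordCountReducer.exe

    hadoop job -list            # list running jobs
    hadoop job -status <job_id> # progress and counters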