I'm new to the Hadoop world, and I need to install Mesos with Hadoop HDFS to build a fault-tolerant distributed file system. However, every installation reference I have found includes components that are unnecessary for my scenario, such as MapReduce.
Do you have any ideas or references about this?
No, you cannot download HDFS alone, because Hadoop 2.x has four core components (HDFS, MapReduce, YARN, and Hadoop Common). HDFS is the core storage component of the Hadoop ecosystem, used to store huge amounts of data. MapReduce is used to process large distributed datasets in parallel.
There are two ways to install Hadoop: single-node and multi-node.
Environment required for Hadoop: the production environment for Hadoop is UNIX, but it can also be run on Windows using Cygwin. Java 1.6 or above is needed to run MapReduce programs. For a Hadoop installation from the tarball on a UNIX environment, you need a Java installation.
Absolutely possible. Don't think of Hadoop as a single installable program; it is just a bunch of Java processes running on different nodes inside a cluster.
If you use the Hadoop tarball, you can run just the NameNode and DataNode processes if you only want HDFS.
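As a minimal sketch of what that looks like in practice (assuming a Hadoop 2.x tarball unpacked at `$HADOOP_HOME`, and `hdfs://localhost:9000` as a placeholder single-node address — adjust paths and hostnames for your environment):

```shell
# Minimal core-site.xml: only the default filesystem URI is needed for HDFS.
# localhost:9000 is an assumed single-node address -- change it for a real cluster.
cat > "$HADOOP_HOME/etc/hadoop/core-site.xml" <<'EOF'
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

# Format the NameNode once, then start only the HDFS daemons --
# no MapReduce or YARN services are started at all.
"$HADOOP_HOME/bin/hdfs" namenode -format
"$HADOOP_HOME/sbin/hadoop-daemon.sh" start namenode
"$HADOOP_HOME/sbin/hadoop-daemon.sh" start datanode

# Verify the filesystem is up by listing its root.
"$HADOOP_HOME/bin/hdfs" dfs -ls /
```

On additional machines you would start only a DataNode and point `fs.defaultFS` at the NameNode's address.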
If you use another Hadoop distribution (HDP, for instance), I think HDFS and MapReduce come in separate RPM packages, but it does no harm to install both. Again, just run the NameNode and DataNodes if you only need HDFS.