My map tasks need some configuration data, which I would like to distribute via the Distributed Cache.
The Hadoop MapReduce Tutorial shows the usage of the DistributedCache class, roughly as follows:
// In the driver
JobConf conf = new JobConf(getConf(), WordCount.class);
...
DistributedCache.addCacheFile(new Path(filename).toUri(), conf);

// In the mapper
Path[] myCacheFiles = DistributedCache.getLocalCacheFiles(job);
...
However, DistributedCache is marked as deprecated in Hadoop 2.2.0.
What is the new preferred way to achieve this? Is there an up-to-date example or tutorial covering this API?
What is the Hadoop Distributed Cache? The distributed cache in Hadoop is a mechanism for copying small files or archives to worker nodes so that tasks running on those nodes can read them locally. To save network bandwidth, the files are copied only once per job rather than once per task.
The APIs for the distributed cache are now on the Job class itself. Check the documentation here: http://hadoop.apache.org/docs/stable2/api/org/apache/hadoop/mapreduce/Job.html The driver code should look something like this:

Job job = Job.getInstance(conf); // new Job() is itself deprecated; use the factory method
...
job.addCacheFile(new Path(filename).toUri());
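For a fuller picture, here is a minimal driver sketch under some assumptions: the class names CacheDriver and CacheMapper, the HDFS path /user/hadoop/config.txt, and the use of command-line arguments for the input and output paths are all placeholders invented for illustration, not part of the original answer.

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class CacheDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "distributed cache example");
        job.setJarByClass(CacheDriver.class);
        job.setMapperClass(CacheMapper.class); // hypothetical mapper, sketched below
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);

        // Ship a small config file (already in HDFS) to every worker node.
        job.addCacheFile(new URI("/user/hadoop/config.txt"));

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}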
In your mapper code:
Path[] localPaths = context.getLocalCacheFiles();
...
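To make this concrete, below is a sketch of a mapper that loads a key=value configuration file from the cache in setup(); the class name CacheMapper, the file format, and the emitted output are assumptions for illustration, not part of the original answer. Note that getLocalCacheFiles() is itself deprecated in later releases in favor of JobContext.getCacheFiles(), which returns URI[].

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashMap;
import java.util.Map;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class CacheMapper extends Mapper<LongWritable, Text, Text, IntWritable> {
    private final Map<String, String> config = new HashMap<>();

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        Path[] localPaths = context.getLocalCacheFiles();
        if (localPaths != null && localPaths.length > 0) {
            // The cached file has been localized to the node's filesystem,
            // so it can be read with plain java.io.
            try (BufferedReader reader = new BufferedReader(
                    new FileReader(localPaths[0].toString()))) {
                String line;
                while ((line = reader.readLine()) != null) {
                    String[] parts = line.split("=", 2);
                    if (parts.length == 2) {
                        config.put(parts[0].trim(), parts[1].trim());
                    }
                }
            }
        }
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Use the configuration loaded in setup(); emitting the config size
        // here just keeps the sketch self-contained.
        context.write(new Text(value.toString()), new IntWritable(config.size()));
    }
}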