Hadoop DistributedCache is deprecated - what is the preferred API?

My map tasks need some configuration data, which I would like to distribute via the Distributed Cache.

The Hadoop MapReduce Tutorial shows the usage of the DistributedCache class, roughly as follows:

// In the driver
JobConf conf = new JobConf(getConf(), WordCount.class);
...
DistributedCache.addCacheFile(new Path(filename).toUri(), conf);

// In the mapper
Path[] myCacheFiles = DistributedCache.getLocalCacheFiles(job);
...

However, DistributedCache is marked as deprecated in Hadoop 2.2.0.

What is the new preferred way to achieve this? Is there an up-to-date example or tutorial covering this API?

asked Jan 20 '14 by DNA
People also ask

What is a distributed cache in Hadoop?

Distributed cache in Hadoop is a mechanism for copying small files or archives to worker nodes ahead of task execution, so that tasks can read them locally. To save network bandwidth, each file is copied only once per job.
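
To illustrate the mechanics (this sketch is not part of the original page): when a driver runs through ToolRunner, the generic -files option handled by GenericOptionsParser copies a local file into the distributed cache at job submission time. The jar, class, and file names below are hypothetical.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

// Driver that goes through ToolRunner so the generic options work, e.g.:
//   hadoop jar wordcount.jar WordCount -files config.txt /in /out
// -files copies config.txt into the distributed cache for every task.
public class WordCount extends Configured implements Tool {

    @Override
    public int run(String[] args) throws Exception {
        // ToolRunner has already stripped the generic options from args.
        Job job = Job.getInstance(getConf(), "word count");
        job.setJarByClass(WordCount.class);
        // ... mapper/reducer and input/output paths from args ...
        return job.waitForCompletion(true) ? 0 : 1;
    }

    public static void main(String[] args) throws Exception {
        System.exit(ToolRunner.run(new Configuration(), new WordCount(), args));
    }
}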


1 Answer

The APIs for the distributed cache have been moved into the Job class itself. See the documentation here: http://hadoop.apache.org/docs/stable2/api/org/apache/hadoop/mapreduce/Job.html. The code should be something like:

// In the driver (Job.getInstance() replaces the deprecated Job constructors)
Job job = Job.getInstance(new Configuration());
...
job.addCacheFile(new Path(filename).toUri());

In your mapper code:

// Note: getLocalCacheFiles() is itself deprecated in 2.x;
// getCacheFiles() is the non-deprecated alternative, returning URI[].
Path[] localPaths = context.getLocalCacheFiles();
...
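
For completeness, here is a minimal, self-contained sketch combining both snippets (not from the original answer). The HDFS path /user/me/config.txt, the local name config.txt, and the parsing loop are placeholders; since getLocalCacheFiles() is also deprecated, the sketch uses context.getCacheFiles() and relies on the cached file being symlinked into the task's working directory under its base name, which is the default behavior on YARN.

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;

public class CacheExample {

    public static class CacheMapper extends Mapper<LongWritable, Text, Text, Text> {

        @Override
        protected void setup(Context context) throws IOException, InterruptedException {
            // Non-deprecated view of the cache: the original URIs added in the driver.
            URI[] cacheFiles = context.getCacheFiles();
            if (cacheFiles != null && cacheFiles.length > 0) {
                // On YARN the cached file is symlinked into the task's working
                // directory under its base name, so it opens as a local file.
                try (BufferedReader reader = new BufferedReader(new FileReader("config.txt"))) {
                    String line;
                    while ((line = reader.readLine()) != null) {
                        // ... parse the configuration data (placeholder) ...
                    }
                }
            }
        }

        // map() omitted ...
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "cache example");
        job.setJarByClass(CacheExample.class);
        job.setMapperClass(CacheMapper.class);
        // Ship the side file to every task via the distributed cache.
        // "/user/me/config.txt" is a placeholder HDFS path.
        job.addCacheFile(new Path("/user/me/config.txt").toUri());
        // ... input/output formats and paths omitted ...
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}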
answered Sep 22 '22 by user2371156