I am trying to run the same code (the org.apache.hadoop.hbase.mapreduce.Export class) from the java command line, adding all the required jars to the classpath ( ./java -cp ".:/npachava/*" Export test /test ), but I am getting the following error:
Exception in thread "main" java.io.IOException: Cannot initialize Cluster. Please check your configuration for mapreduce.framework.name and the correspond server addresses.
at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:120)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:82)
at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:75)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1260)
at org.apache.hadoop.mapreduce.Job$9.run(Job.java:1256)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:415)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1657)
at org.apache.hadoop.mapreduce.Job.connect(Job.java:1256)
at org.apache.hadoop.mapreduce.Job.submit(Job.java:1284)
at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1308)
at Export.main(Export.java:194)
But running it from the command prompt in the hbase bin directory ( ./hbase org.apache.hadoop.hbase.mapreduce.Driver export test /TestTableData ) works perfectly fine.
I have tried setting mapreduce.framework.name to both "yarn" and "local", but neither worked:
Configuration conf = HBaseConfiguration.create();
conf.set("mapreduce.framework.name", "yarn"); // also tried "local"
Can anyone please help? I am running HBase 0.94.17 on Linux.
Add hadoop-mapreduce-client-jobclient.jar to your classpath (pick the build that matches your Hadoop version when downloading).
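For context on why this helps, as far as I understand it: Cluster.initialize() discovers ClientProtocolProvider implementations via ServiceLoader, and the YARN provider lives in hadoop-mapreduce-client-jobclient.jar (the "local" one lives in hadoop-mapreduce-client-common.jar), so without that jar on the classpath there is nothing to build the Cluster with. You can check what your classpath actually exposes with a small snippet like this (the class name is just an example):

import java.util.ServiceLoader;
import org.apache.hadoop.mapreduce.protocol.ClientProtocolProvider;

// Prints every ClientProtocolProvider visible on the current classpath.
// If none of them matches your mapreduce.framework.name, Job.connect()
// fails with "Cannot initialize Cluster".
public class ListProviders {
    public static void main(String[] args) {
        for (ClientProtocolProvider p : ServiceLoader.load(ClientProtocolProvider.class)) {
            System.out.println(p.getClass().getName());
        }
    }
}

Run it with the same classpath you use for Export (e.g. ./java -cp ".:/npachava/*" ListProviders); once hadoop-mapreduce-client-jobclient.jar is in that directory, a YARN provider should show up in the output.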