I want to add the HBase classpath to my Spark setup, but I get an error when I run the `hbase classpath`
command.
I have Hadoop 3.2.0 set up locally with Java 1.8 in the environment.
$ hbase classpath
/usr/lib/hadoop/libexec/hadoop-functions.sh: line 2364: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_USER: invalid variable name
/usr/lib/hadoop/libexec/hadoop-functions.sh: line 2459: HADOOP_ORG.APACHE.HADOOP.HBASE.UTIL.GETJAVAPROPERTY_OPTS: invalid variable name
Error: Could not find or load main class org.apache.hadoop.hbase.util.GetJavaProperty
Obviously an old question, but your configuration may be wrong.
This is potentially caused by insufficient privileges. Try prefixing the command with `sudo` to troubleshoot. In my case, this confirmed that the failure was a privilege issue when the command inside hadoop-functions.sh was being executed.
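Once `hbase classpath` runs cleanly (with or without `sudo`), a common way to expose those jars to Spark is through the `SPARK_DIST_CLASSPATH` environment variable. A minimal sketch, assuming the standard `hbase` and Spark launcher scripts are on `PATH` (the variable name `HBASE_CP` is just for illustration):

```shell
# Capture the HBase classpath; fall back to empty if the command fails.
# If it only works with elevated privileges, use: sudo hbase classpath
HBASE_CP="$(hbase classpath 2>/dev/null || true)"

# Append it to whatever SPARK_DIST_CLASSPATH already holds,
# adding a ':' separator only when the variable is non-empty.
export SPARK_DIST_CLASSPATH="${SPARK_DIST_CLASSPATH:+$SPARK_DIST_CLASSPATH:}${HBASE_CP}"

# Spark launched from this shell (e.g. spark-shell or spark-submit)
# should now see the HBase jars on its classpath.
```

Alternatively, the same classpath string can be passed per job via Spark's `--driver-class-path` / `spark.executor.extraClassPath` settings instead of a shell-wide export.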