When I run hive, it shows the following error:
[hadoop@a74f90aecdad tmp]$ hive
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/hive/lib/log4j-slf4j-impl-2.10.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/hadoop/share/hadoop/common/lib/slf4j-log4j12-1.7.25.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Exception in thread "main" java.lang.NoSuchMethodError: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1357)
at org.apache.hadoop.conf.Configuration.set(Configuration.java:1338)
at org.apache.hadoop.mapred.JobConf.setJar(JobConf.java:536)
at org.apache.hadoop.mapred.JobConf.setJarByClass(JobConf.java:554)
at org.apache.hadoop.mapred.JobConf.<init>(JobConf.java:448)
at org.apache.hadoop.hive.conf.HiveConf.initialize(HiveConf.java:5141)
at org.apache.hadoop.hive.conf.HiveConf.<init>(HiveConf.java:5099)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:97)
at org.apache.hadoop.hive.common.LogUtils.initHiveLog4j(LogUtils.java:81)
at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:699)
at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:683)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.apache.hadoop.util.RunJar.run(RunJar.java:323)
at org.apache.hadoop.util.RunJar.main(RunJar.java:236)
What does this error mean? Is there a way to get more details?
You can usually find the cause of a Hive error in the syslog file, which is linked from the Steps pane. If you can't determine the problem there, check the error message of the Hadoop task attempt, linked from the Task Attempts pane.
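If the job logs don't reveal enough, one way to get more detail from the CLI itself is to rerun Hive with debug logging sent to the console. This is just a sketch using the standard --hiveconf switch:

# Rerun the Hive CLI with DEBUG-level logging printed to the console.
hive --hiveconf hive.root.logger=DEBUG,console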
This issue generally occurs when the memory configured for the map task is insufficient to complete the task. To resolve it, increase the map-task memory from its current value.
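As a rough sketch, the map-task memory can be raised per session with the standard MRv2 properties; the values below are only illustrative and must fit your cluster's capacity:

# Raise the container size and the JVM heap for map tasks (example values).
hive --hiveconf mapreduce.map.memory.mb=4096 \
     --hiveconf mapreduce.map.java.opts=-Xmx3277m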
Summarizing my answer from a similar question on Super User: https://superuser.com/a/1501841/115371
Hadoop core updated its Guava version in HADOOP-15960, which went out in Hadoop 3.0.4, 3.1.3, and 3.2.1.
Hive 3.1.2 ships its own, incompatible version of Guava: not only in lib/guava-*.jar, but also in hive-exec-*.jar, a "fat JAR" that repackages several external libraries. The Hive services/CLI don't run without hive-exec-*.jar, which means we can't just delete guava-*.jar to get around the compatibility problem.
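To see the clash for yourself, a quick check (using the /hadoop and /hive paths visible in the session above) is to list each side's Guava JAR and then confirm that hive-exec-*.jar also repackages unshaded Guava classes:

# Compare the standalone Guava JARs shipped by Hadoop and Hive.
ls /hadoop/share/hadoop/common/lib/guava-*.jar
ls /hive/lib/guava-*.jar
# hive-exec is a fat JAR; it bundles its own copies of the Guava classes.
unzip -l /hive/lib/hive-exec-*.jar | grep 'com/google/common' | head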
At the moment, the only options are either:
- patch or rebuild Hive so that hive-exec-*.jar no longer carries the incompatible Guava (the hive-exec-*.jar problem), or
- run a Hadoop core release from before the Guava update.
This error pops up when a program is compiled against one version of a JAR but run against a different (typically older) version of the same JAR. In this scenario, the most likely culprit is a conflict between the Guava JAR versions shipped by Hadoop and Hive.
Step 1: Go to HADOOP_INSTALLATION_DIR/share/hadoop/common/lib and check the guava.jar version.
Step 2: Now go to HIVE_INSTALLATION_DIR/lib and compare Hive's Guava version with Hadoop's. If they differ, delete the older of the two and copy the newer version into both locations, as sketched below.
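Here is a minimal sketch of those two steps, assuming HADOOP_HOME and HIVE_HOME point at the respective installations. The version numbers are only examples (Hive 3.1.2 ships guava-19.0.jar, Hadoop 3.2.1 ships guava-27.0-jre.jar), so substitute whatever ls actually reports:

# Step 1: check the Guava version on the Hadoop side.
ls "$HADOOP_HOME"/share/hadoop/common/lib/guava-*.jar
# Step 2: compare with Hive's copy; if Hive's is older, replace it.
ls "$HIVE_HOME"/lib/guava-*.jar
mv "$HIVE_HOME"/lib/guava-19.0.jar "$HIVE_HOME"/lib/guava-19.0.jar.bak
cp "$HADOOP_HOME"/share/hadoop/common/lib/guava-27.0-jre.jar "$HIVE_HOME"/lib/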