I am using Hive 0.9.0 with Hadoop 1.1.2 and NetBeans, but I get the error below and cannot solve the problem. Please help. Code:
package hive;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class Hive_test {

    private static String driverName = "org.apache.hadoop.hive.jdbc.HiveDriver";

    @SuppressWarnings("CallToThreadDumpStack")
    public static void main(String[] args) throws SQLException {
        // Load the Hive JDBC driver
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        }

        System.out.println("commencer la connexion"); // "starting the connection"
        Connection con = DriverManager.getConnection("jdbc:hive://localhost:10000/default", "", " ");
        Statement stmt = con.createStatement();
        ResultSet res = stmt.executeQuery("select * from STATE");
        while (res.next()) {
            System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
            System.out.println("sql terminer"); // "SQL finished"
        }
    }
}
The error output is below:
commencer la connexion
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
at org.apache.thrift.protocol.TBinaryProtocol.readStringBody(TBinaryProtocol.java:353)
at org.apache.thrift.protocol.TBinaryProtocol.readMessageBegin(TBinaryProtocol.java:215)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:69)
at org.apache.hadoop.hive.service.ThriftHive$Client.recv_execute(ThriftHive.java:116)
at org.apache.hadoop.hive.service.ThriftHive$Client.execute(ThriftHive.java:103)
at org.apache.hadoop.hive.jdbc.HiveStatement.executeQuery(HiveStatement.java:192)
at org.apache.hadoop.hive.jdbc.HiveStatement.execute(HiveStatement.java:132)
at org.apache.hadoop.hive.jdbc.HiveConnection.configureConnection(HiveConnection.java:132)
at org.apache.hadoop.hive.jdbc.HiveConnection.<init>(HiveConnection.java:122)
at org.apache.hadoop.hive.jdbc.HiveDriver.connect(HiveDriver.java:106)
at java.sql.DriverManager.getConnection(DriverManager.java:571)
at java.sql.DriverManager.getConnection(DriverManager.java:215)
at hive.Hive_test.main(Hive_test.java:22)
This is a java.lang.OutOfMemoryError: Java heap space exception. Usually, this error is thrown when there is insufficient space to allocate an object in the Java heap: the garbage collector cannot make space available to accommodate a new object, and the heap cannot be expanded further.
An easy way to address an OutOfMemoryError in Java is to increase the maximum heap size with a JVM option such as -Xmx512M; this will often make the error go away immediately.
If you are launching the program from an IDE, you can raise the IDE's memory settings instead: from the main menu, select Help | Change Memory Settings, set the amount of memory you want to allocate, and click Save and Restart.
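For example, when the client is launched from the command line, the heap size can be passed straight to the JVM. This is only a minimal sketch, assuming the compiled class and the Hive/Hadoop client jars are on the classpath; the paths below are placeholders, not actual install locations:
java -Xmx512m -cp .:/path/to/hive/lib/*:/path/to/hadoop/lib/* hive.Hive_test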
You can set the container heap size in Hive and resolve this error:
Most tools that run on top of the Hadoop MapReduce framework provide ways to tune these Hadoop-level settings for their jobs. There are multiple ways to do this in Hive; three of them are shown here:
1) Pass it directly via the Hive command line:
hive -hiveconf mapreduce.map.memory.mb=4096 -hiveconf mapreduce.reduce.memory.mb=5120 -e "select count(*) from test_table;"
2) Set the HIVE_OPTS environment variable before invoking Hive:
export HIVE_OPTS="-hiveconf mapreduce.map.memory.mb=4096 -hiveconf mapreduce.reduce.memory.mb=5120"
3) Use the "set" command within the Hive CLI:
hive> set mapreduce.map.memory.mb=4096;
hive> set mapreduce.reduce.memory.mb=5120;
hive> select count(*) from test_table;
Well, in my case, I also needed to set the memory in java.opts:
set mapreduce.map.memory.mb=4096;
set mapreduce.map.java.opts=-Xmx3686m;
set mapreduce.reduce.memory.mb=4096;
set mapreduce.reduce.java.opts=-Xmx3686m;
For me, the solution below works.
Before starting the Hive CLI, run
export HADOOP_CLIENT_OPTS="-Xmx8192m"
and then launch the CLI.
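A minimal sketch of that full sequence, assuming the hive launcher script is on your PATH:
export HADOOP_CLIENT_OPTS="-Xmx8192m"
hive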