This question is similar to, but not the same as, Hive JDBC getConnection does not return. Here, however, the connection is to a remote machine. Also, the metastore is present in the directory in which hiveserver2 was started.
We have a MapR cluster running on a remote machine, and I would like to connect to Hive on this cluster using Java JDBC. Hence we started the Hive server:
/opt/mapr/hive/hive-0.11/bin/hiveserver2
The output of the server process does not contain any error messages. It listens on port 10000 as reported by netstat.
I try to connect to the server as described in https://cwiki.apache.org/confluence/display/Hive/HiveClient, replacing localhost with the name of the server on which hiveserver2 is running:
Connection con =
    DriverManager.getConnection("jdbc:hive://myserver.example.com:10000/default", "", "");
Yet the program hangs at exactly this statement and never obtains a connection.
Do I possibly need to supply a username and password?
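For completeness, my client is essentially the HiveJdbcClient example from the wiki page above, reduced to roughly the following (the SHOW TABLES query is just a placeholder; execution never gets that far):

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class HiveJdbcClient {
    public static void main(String[] args) throws Exception {
        // Register the HiveServer1 driver, as in the wiki example
        Class.forName("org.apache.hadoop.hive.jdbc.HiveDriver");

        // This is the call that hangs
        Connection con =
            DriverManager.getConnection("jdbc:hive://myserver.example.com:10000/default", "", "");

        // Placeholder query; never reached because getConnection() does not return
        Statement stmt = con.createStatement();
        ResultSet res = stmt.executeQuery("SHOW TABLES");
        while (res.next()) {
            System.out.println(res.getString(1));
        }
        con.close();
    }
}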
Initially I had used the driver org.apache.hadoop.hive.jdbc.HiveDriver, but it seems I should be using org.apache.hive.jdbc.HiveDriver since hiveserver2 is running. With that driver (and the jdbc:hive2 URL) I am now getting the following exception:
Exception in thread "main" java.sql.SQLException: Could not establish connection to jdbc:hive2://myserver.example.com:10000/default: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null)
at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:246)
at org.apache.hive.jdbc.HiveConnection.<init>(HiveConnection.java:132)
at org.apache.hive.jdbc.HiveDriver.connect(HiveDriver.java:105)
at java.sql.DriverManager.getConnection(DriverManager.java:579)
at java.sql.DriverManager.getConnection(DriverManager.java:221)
at HiveJdbcClient.main(HiveJdbcClient.java:22)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Caused by: org.apache.thrift.TApplicationException: Required field 'client_protocol' is unset! Struct:TOpenSessionReq(client_protocol:null)
at org.apache.thrift.TApplicationException.read(TApplicationException.java:108)
at org.apache.thrift.TServiceClient.receiveBase(TServiceClient.java:71)
at org.apache.hive.service.cli.thrift.TCLIService$Client.recv_OpenSession(TCLIService.java:144)
at org.apache.hive.service.cli.thrift.TCLIService$Client.OpenSession(TCLIService.java:131)
at org.apache.hive.jdbc.HiveConnection.openSession(HiveConnection.java:237)
... 10 more
I had the same problem and was able to get around it by adding the correct dependency to my pom.xml. I had been pulling the latest Apache release of Hive from Maven Central and switched to the CDH4 release from the Cloudera repository. So what you are seeing may be a symptom of having the wrong hive-jdbc dependency; the "Required field 'client_protocol' is unset" error typically indicates a version mismatch between the Hive client and server. Here's the Maven snippet I added to my pom file:
<repository>
  <id>cloudera</id>
  <url>https://repository.cloudera.com/artifactory/cloudera-repos/</url>
</repository>
...
<dependency>
  <groupId>org.apache.hive</groupId>
  <artifactId>hive-jdbc</artifactId>
  <version>0.10.0-cdh4.3.2</version>
</dependency>
Here's a link about the Cloudera repo.
Also, adding ";auth=noSasl" to the URL made my application hang, so I removed it.
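If you want to verify which hive-jdbc jar actually ends up on your classpath after changing the dependency, one quick sanity check (the class name below is just an example, any class from the driver jar works) is to print where the driver class was loaded from:

public class CheckHiveJdbcJar {
    public static void main(String[] args) throws Exception {
        // Prints the jar the Hive JDBC driver class was loaded from;
        // after switching the dependency this should point at the cdh4 artifact
        Class<?> driver = Class.forName("org.apache.hive.jdbc.HiveDriver");
        System.out.println(driver.getProtectionDomain().getCodeSource().getLocation());
    }
}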
I think you need to specify the username. Also, it should be hive2, not hive, since you are using hiveserver2. Try modifying your connection URL:
Connection con =
    DriverManager.getConnection("jdbc:hive2://myserver.example.com:10000/default", "<user>", "");
It's given in the Hive2 link.
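Putting both changes together (the hive2 scheme plus a username), a minimal test client would look roughly like this; "mapr" is only an example user, use whichever account should run the queries:

import java.sql.Connection;
import java.sql.DriverManager;

public class Hive2ConnectTest {
    public static void main(String[] args) throws Exception {
        // HiveServer2 uses the new driver class and the jdbc:hive2 scheme
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // "mapr" is only an example user; the password can stay empty
        // when no authentication is configured on the cluster
        Connection con = DriverManager.getConnection(
                "jdbc:hive2://myserver.example.com:10000/default", "mapr", "");
        System.out.println("Connected: " + !con.isClosed());
        con.close();
    }
}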
Hope this helps!