I have written this query to create a table in Hive. My data is originally in JSON format, so I downloaded and built the SerDe and added all the JARs required for it to run. But I am getting the following error:
FAILED: Execution Error, return code 1 from org.apache.hadoop.hive.ql.exec.DDLTask. Cannot validate serde: org.openx.data.jsonserde.JsonSerDe
QUERY:
create table tip(
    type string,
    text string,
    business_id string,
    user_id string,
    date date,
    likes int)
ROW FORMAT SERDE 'org.openx.data.jsonserde.JsonSerDe'
WITH SERDEPROPERTIES ("date.mapping" = "date")
STORED AS TEXTFILE;
JsonSerDe is a SerDe that provides serialization and deserialization in JSON format; the implementation delegates to underlying JsonSerializer and JsonDeserializer implementations.
The Hive JSON SerDe is commonly used to process JSON data such as events. These events are represented as single-line strings of JSON-encoded text separated by newlines. The Hive JSON SerDe does not allow duplicate keys in map or struct key names.
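For illustration, input for the tip table above would therefore be one JSON object per line, something like the following (the field values here are made-up sample records, not real data):

{"type": "tip", "text": "Great coffee!", "business_id": "b1", "user_id": "u1", "date": "2015-08-01", "likes": 3}
{"type": "tip", "text": "Try the bagels.", "business_id": "b2", "user_id": "u2", "date": "2015-08-02", "likes": 0}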
I too encountered this problem. In my case, I managed to fix the issue by adding json-serde-1.3.7-SNAPSHOT-jar-with-dependencies.jar at the hive command prompt, as shown below:
hive> ADD JAR /usr/local/Hive-JSON-Serde/json-serde/target/json-serde-1.3.7-SNAPSHOT-jar-with-dependencies.jar;
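You can confirm that the JAR was registered for the session by listing the session's resources (the exact output depends on your path):
hive> LIST JARS;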
Below are the steps I followed on Ubuntu 14.04:
1. Fire up a Linux terminal and cd /usr/local
2. sudo git clone https://github.com/rcongiu/Hive-JSON-Serde.git
3. cd Hive-JSON-Serde, then sudo mvn -Pcdh5 clean package
4. The SerDe JAR will be in /usr/local/Hive-JSON-Serde/json-serde/target/json-serde-1.3.7-SNAPSHOT-jar-with-dependencies.jar
5. Go to the hive prompt and ADD JAR the file from Step 4:
   hive> ADD JAR /usr/local/Hive-JSON-Serde/json-serde/target/json-serde-1.3.7-SNAPSHOT-jar-with-dependencies.jar;
6. Now create the Hive table from the hive> prompt. At this stage, the table should be created successfully without any error.
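Note that ADD JAR only registers the SerDe for the current session. One common way to make it permanent, sketched here assuming the JAR stays at the Step 4 location, is to append the command to ~/.hiverc, which the Hive CLI executes on startup:

echo 'ADD JAR /usr/local/Hive-JSON-Serde/json-serde/target/json-serde-1.3.7-SNAPSHOT-jar-with-dependencies.jar;' >> ~/.hiverc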
Hive version: 1.2.1
Hadoop version: 2.7.1
Reference: Hive-JSON-Serde (https://github.com/rcongiu/Hive-JSON-Serde)