 

Hive multi-table join with same condition error

Tags:

hadoop

hive

I am running several scripts and keep getting the same error. All of them perform multi-table joins with the same join condition.

Data is stored as parquet.

Hive version 1.2.1 / MR

SELECT count(*) 
FROM   xxx.tmp_usr_1 m
INNER JOIN xxx.tmp_usr n
ON m.date_id = n.date_id AND m.end_user_id = n.end_user_id
LEFT JOIN xxx.usr_2 p
ON m.date_id = p.date_id AND m.end_user_id = p.end_user_id;

Here is the error message:

2017-01-22 16:47:55,208 Stage-1 map = 54%, reduce = 0%, Cumulative CPU 560.81 sec
2017-01-22 16:47:56,248 Stage-1 map = 58%, reduce = 0%, Cumulative CPU 577.74 sec
2017-01-22 16:47:57,290 Stage-1 map = 100%, reduce = 100%, Cumulative CPU 446.32 sec
MapReduce Total cumulative CPU time: 7 minutes 26 seconds 320 msec
Ended Job = job_1484710871657_6350 with errors
Error during job, obtaining debugging information...
Examining task ID: task_1484710871657_6350_m_000061 (and more) from job job_1484710871657_6350
Examining task ID: task_1484710871657_6350_m_000069 (and more) from job job_1484710871657_6350
Examining task ID: task_1484710871657_6350_m_000053 (and more) from job job_1484710871657_6350
Examining task ID: task_1484710871657_6350_m_000011 (and more) from job job_1484710871657_6350
Examining task ID: task_1484710871657_6350_m_000063 (and more) from job job_1484710871657_6350
Examining task ID: task_1484710871657_6350_m_000049 (and more) from job job_1484710871657_6350
Examining task ID: task_1484710871657_6350_m_000052 (and more) from job job_1484710871657_6350
Task with the most failures(4):
-----
Task ID: task_1484710871657_6350_m_000071
URL: http://xxxxxxxxxx/taskdetails.jsp?jobid=job_1484710871657_6350&tipid=task_1484710871657_6350_m_000071
-----
Diagnostic Messages for this Task:
Error: java.io.IOException: java.lang.reflect.InvocationTargetException
    at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
    at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:266)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(HadoopShimsSecure.java:213)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getRecordReader(HadoopShimsSecure.java:333)
    at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:719)
    at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:169)
    at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:432)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:343)
    at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:163)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1671)
    at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:158)
Caused by: java.lang.reflect.InvocationTargetException
    at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.lang.reflect.Constructor.newInstance(Constructor.java:422)
    at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:252)
    ... 11 more
Caused by: java.lang.IllegalStateException: Invalid schema data type, found: PRIMITIVE, expected: STRUCT
    at org.apache.hadoop.hive.ql.io.parquet.read.DataWritableReadSupport.getProjectedGroupFields(DataWritableReadSupport.java:118)
    at org.apache.hadoop.hive.ql.io.parquet.read.DataWritableReadSupport.getSchemaByName(DataWritableReadSupport.java:156)
    at org.apache.hadoop.hive.ql.io.parquet.read.DataWritableReadSupport.init(DataWritableReadSupport.java:222)
    at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.getSplit(ParquetRecordReaderWrapper.java:256)
    at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:99)
    at org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.<init>(ParquetRecordReaderWrapper.java:85)
    at org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat.getRecordReader(MapredParquetInputFormat.java:72)
    at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:67)
    ... 16 more
Container killed by the ApplicationMaster.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

My data consists of about 20M records. When I join the tables on a single column (end_user_id), I get the same error.

The join columns have the same data type. Joining A to B in a subquery first, and then joining the result to C, works around the issue (see the sketch below).
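For reference, this is roughly what the subquery-style rewrite looks like for the failing query above (a sketch only; the projected columns in the subquery are assumed, keep whatever columns the real query needs):

-- Join tmp_usr_1 and tmp_usr first, then LEFT JOIN usr_2 against the result.
SELECT count(*)
FROM (
    SELECT m.date_id, m.end_user_id
    FROM   xxx.tmp_usr_1 m
    INNER JOIN xxx.tmp_usr n
      ON m.date_id = n.date_id AND m.end_user_id = n.end_user_id
) mn
LEFT JOIN xxx.usr_2 p
  ON mn.date_id = p.date_id AND mn.end_user_id = p.end_user_id;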

We have many SQL queries with multi-table joins on the same condition, but only a few of those scripts hit this error.

asked Oct 30 '22 by Mike Gan


1 Answer

Make sure the data types of the join columns are the same across all of the tables.
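For example, you can compare the declared types with DESCRIBE and, if they differ, either cast the key in the ON clause or align the column definition. This is a sketch against the tables from the question; the BIGINT/STRING mismatch is only an assumed example:

-- Check the declared type of the join keys in each table.
DESCRIBE xxx.tmp_usr_1;
DESCRIBE xxx.tmp_usr;
DESCRIBE xxx.usr_2;

-- If, say, one table declares end_user_id as STRING and another as BIGINT (assumption),
-- either cast explicitly in the join ...
SELECT count(*)
FROM   xxx.tmp_usr_1 m
INNER JOIN xxx.tmp_usr n
  ON m.date_id = n.date_id AND CAST(m.end_user_id AS BIGINT) = n.end_user_id;

-- ... or change the table definition so it matches the type actually stored in the Parquet files:
ALTER TABLE xxx.tmp_usr_1 CHANGE COLUMN end_user_id end_user_id BIGINT;

Note that for Parquet-backed tables an ALTER TABLE only changes the Hive metadata, so the declared type still has to agree with the schema written in the underlying Parquet files.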

answered Nov 15 '22 by Kanagaraj Dhanapal