Is there a way to get Hive to output its results in a columnar fashion, like the "\G" option available in MySQL?
http://dev.mysql.com/doc/refman/5.5/en/mysql-commands.html
One approach is to write the query result to a local directory and then read the file back into a hiveconf variable:

./hive -e "use telecom; insert overwrite local directory '/tmp/result' select avg(a) from abc;"
./hive --hiveconf MY_VAR=`cat /tmp/result/000000_0`
To save the result directly in HDFS instead, use the command below:

hive> insert overwrite directory '/user/cloudera/Sample' row format delimited fields terminated by '\t' stored as textfile select * from table where id > 100;

This will put the contents in the folder /user/cloudera/Sample in HDFS.
If you use HiveServer2 (Hive > 0.14), you can use the Beeline shell, which has a "vertical" output format:
0: jdbc:hive2://127.0.0.1:10000> !set outputformat table
0: jdbc:hive2://127.0.0.1:10000> select * from sample_07 limit 1;
+-----------------+------------------------+----------------------+-------------------+
| sample_07.code  | sample_07.description  | sample_07.total_emp  | sample_07.salary  |
+-----------------+------------------------+----------------------+-------------------+
| 00-0000         | All Occupations        | 134354250            | 40690             |
+-----------------+------------------------+----------------------+-------------------+
1 row selected (0.131 seconds)
0: jdbc:hive2://127.0.0.1:10000> !set outputformat vertical
0: jdbc:hive2://127.0.0.1:10000> select * from sample_07 limit 1;
sample_07.code 00-0000
sample_07.description All Occupations
sample_07.total_emp 134354250
sample_07.salary 40690
1 row selected (0.063 seconds)
0: jdbc:hive2://127.0.0.1:10000>
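The output format can also be chosen when launching Beeline, so the whole session uses vertical output from the start (this assumes the same HiveServer2 endpoint and sample_07 table as above):

beeline -u jdbc:hive2://127.0.0.1:10000 --outputformat=vertical -e "select * from sample_07 limit 1"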
No, there is no such facility in Hive itself.
The results of MapReduce jobs are always displayed row by row.
However, you can use the Hive/Thrift server and submit your Hive queries through another scripting language, such as Python, and control how the output is displayed. The only disadvantage is that you will have to parse the output and format it yourself.
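For example, here is a minimal sketch of that approach using the PyHive package (an assumption; any Thrift-based HiveServer2 client would work) against the endpoint and sample_07 table shown in the Beeline example above. It prints each row vertically, much like MySQL's \G:

from pyhive import hive

# Connect to HiveServer2; host and port match the Beeline example above.
conn = hive.Connection(host='127.0.0.1', port=10000)
cursor = conn.cursor()
cursor.execute('SELECT * FROM sample_07 LIMIT 1')

# cursor.description holds one (name, type, ...) tuple per result column.
columns = [col[0] for col in cursor.description]
for i, row in enumerate(cursor.fetchall(), start=1):
    print('*************************** %d. row ***************************' % i)
    for name, value in zip(columns, row):
        print('%s: %s' % (name, value))

cursor.close()
conn.close()

Because the client receives plain rows over Thrift, you are free to lay them out however you like.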