 

Exporting a Hive table to CSV in HDFS

Tags:

hadoop

hive

I know there is a known issue with delimiters when saving a table to csv (or other text file) in Hive. So I'm wondering if you guys can help me get around that.

I have an existing table (Table A) and I would like to save it in csv format to hdfs. From reading other responses I believe I would have to first create an external table (but I'm not sure how the whole thing would look).

Can anyone help?

Laura asked May 13 '15 20:05


People also ask

How do I copy data from Hive table to HDFS?

Hey, data in Hive tables resides on HDFS; Hive is only a metadata layer that lets you access that HDFS data in the form of tables and rows. So you don't need to transfer data from Hive to HDFS, as the data is already on HDFS.
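You can see this directly by listing the files that back a managed table. A sketch, assuming the default warehouse location (`hive.metastore.warehouse.dir`) and hypothetical database and table names `mydb` / `table_a`:

```shell
# List the HDFS files backing a managed Hive table.
# /user/hive/warehouse is the default warehouse path; adjust if your
# cluster configures hive.metastore.warehouse.dir differently.
hadoop fs -ls /user/hive/warehouse/mydb.db/table_a
```

The files shown there are the table's actual data; exporting to CSV is really a question of getting Hive to write delimited files to a location of your choosing.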


2 Answers

Try this in the Hive shell:

INSERT OVERWRITE LOCAL DIRECTORY '/path/to/hive/csv'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
SELECT * FROM hivetablename;

Change /path/to/hive/csv to the local directory where you want the CSV files written, and hivetablename to the Hive table you want to export. Note that LOCAL DIRECTORY writes to the local filesystem of the machine running the query, not to HDFS.
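Since the question asks for the CSV in HDFS rather than on the local filesystem, dropping the LOCAL keyword makes Hive write to an HDFS directory instead. A sketch, assuming Hive 0.11 or later (where ROW FORMAT DELIMITED is supported on directory exports) and a hypothetical output path:

```sql
-- Without LOCAL, the target directory is in HDFS.
-- '/user/laura/table_a_csv' is a hypothetical path; table_a is the
-- existing table (Table A) from the question.
INSERT OVERWRITE DIRECTORY '/user/laura/table_a_csv'
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
SELECT * FROM table_a;
```

Hive writes one or more delimited part files (e.g. 000000_0) under that directory; they can be merged or renamed with hadoop fs commands if a single .csv file is needed.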

Rajesh N answered Nov 02 '22 10:11


To export via an external table in Hive, you can follow these steps:

  1. Create an external table in Hive:

    CREATE EXTERNAL TABLE external_table (number INT, name STRING)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/user/hive/external/mytable/';

  2. Load the data file from the local filesystem into the HDFS location:

    hadoop fs -put /home/user1/Desktop/filename.csv /user/hive/external/mytable/

These two steps should solve your problem.
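If the data is already in an existing Hive table (Table A in the question) rather than in a local file, step 2 can be replaced by inserting into the external table; Hive then writes comma-delimited files under the LOCATION directory in HDFS. A sketch, assuming a hypothetical source table table_a whose columns match the external table:

```sql
-- Populate the external table from the existing table; Hive writes
-- comma-delimited text files under /user/hive/external/mytable/ in HDFS.
INSERT OVERWRITE TABLE external_table
SELECT number, name FROM table_a;
```

After the insert, the files under /user/hive/external/mytable/ are the CSV export.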

Farooque answered Nov 02 '22 08:11