I am new to Hadoop/Hive. We are using open-source Hadoop and Hive, installed on a single-node Ubuntu cluster. I have 1 million rows of data in a CSV file that I moved from Windows to Linux. When I load the data into Hive with the following command, only NULL values end up in the table:
LOAD DATA INPATH '/home/goldstone/Desktop/RejectStats.csv'
OVERWRITE INTO TABLE rejstats;
I even tried placing the file in HDFS and loading it from there, but I still see the same issue.
My table structure is as follows:
CREATE TABLE rejstats (amount_requested INT , appdate TIMESTAMP ,
loan_title STRING , dbt_income_ratio FLOAT , city STRING ,
state STRING , employment_lenght STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
STORED AS TEXTFILE;
I am attaching a screenshot of the NULL values returned.
Could anyone please help me with this issue? Thank you.
Regards, Divya.
I think you are trying to load a comma-separated file into a table whose fields are declared as FIELDS TERMINATED BY '\t', i.e. tab-delimited. Hive never finds a tab in your lines, so each entire line lands in the first column (where it fails the INT conversion and becomes NULL) and every remaining column comes out NULL. Recreate the table with a comma delimiter and reload:
CREATE TABLE rejstats (amount_requested INT , appdate TIMESTAMP ,
loan_title STRING , dbt_income_ratio FLOAT , city STRING ,
state STRING , employment_lenght STRING) ROW FORMAT DELIMITED FIELDS
TERMINATED BY ',' STORED AS TEXTFILE;

LOAD DATA LOCAL INPATH '/home/goldstone/Desktop/RejectStats.csv'
OVERWRITE INTO TABLE rejstats;

Note the LOCAL keyword: LOAD DATA INPATH expects a path inside HDFS, while LOAD DATA LOCAL INPATH copies the file from the local filesystem, which is what your /home/goldstone/Desktop path is. If you have already put the file into HDFS, drop LOCAL and point INPATH at the HDFS location instead.
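As an aside: if any field can contain an embedded comma inside quotes (loan titles often do), a plain DELIMITED table will still split those rows incorrectly. Below is a minimal sketch using Hive's built-in OpenCSVSerde (available from Hive 0.14 onward); note that this SerDe exposes every column as STRING, so you would CAST in your queries, and the table name rejstats_csv is just a placeholder of my own:

-- hypothetical alternative table: OpenCSVSerde respects quoted fields,
-- but reads every column as STRING
CREATE TABLE rejstats_csv (amount_requested STRING, appdate STRING,
loan_title STRING, dbt_income_ratio STRING, city STRING,
state STRING, employment_lenght STRING)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.OpenCSVSerde'
WITH SERDEPROPERTIES ("separatorChar" = ",", "quoteChar" = "\"")
STORED AS TEXTFILE;

-- quick sanity check after loading
SELECT * FROM rejstats_csv LIMIT 5;

One more thing worth checking, since you moved the file over from Windows: CRLF line endings leave a trailing carriage return on the last field of every row, which can silently corrupt values in Hive text tables, so convert the file to Unix line endings before loading if needed.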