
Import big CSV file into MySQL

Tags: mysql, csv

I am trying to import a 300 MB CSV file into a MySQL table. I am using this command:

LOAD DATA INFILE 'c:/csv/bigCSV.csv' IGNORE 
INTO TABLE table
FIELDS TERMINATED BY ',' 
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES

And it works great with small files (1 MB or so), but when I try to load a big file like the one mentioned, MySQL Workbench (which I use to execute my queries) runs the command, everything looks OK and green, but 0 rows are affected. No changes at all in the table.

I am 10000% sure that the table is OK, because when I take a portion of that file, e.g. 1 MB, and load it into the same table, it works fine.

Has anyone had this kind of problem?

Thank you.

Adrian Ivasku asked Nov 26 '15


2 Answers

I have "solved" it. I don't know why, and I feel stupid for not playing with the statement earlier, but like this:

LOAD DATA INFILE 'c:/csv/eventsbig.csv' IGNORE 
INTO TABLE std9
FIELDS TERMINATED BY ',' 
LINES TERMINATED BY '\n'

Without the IGNORE 1 LINES at the end, it works with files of any size.
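Note that the working statement also changed LINES TERMINATED BY from '\r\n' to '\n'. If the file actually uses Unix line endings, a '\r\n' terminator makes MySQL treat the entire file as one huge line, and IGNORE 1 LINES then discards that single "line" — everything — which matches the "0 rows affected" symptom. A quick way to check the file's line endings before loading (a sketch; the file name is taken from the answer above):

```shell
# 'file' reports the line-terminator style of a text file:
# "with CRLF line terminators" means Windows-style \r\n endings;
# plain "ASCII text" with no such note means Unix-style \n endings.
file eventsbig.csv

# If the header row still needs to be skipped without IGNORE 1 LINES,
# strip it before loading instead (writes a new header-less copy):
tail -n +2 eventsbig.csv > eventsbig_noheader.csv
```

With the header removed up front, the terminator in LOAD DATA only has to match the actual file, and no IGNORE 1 LINES clause is needed.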

Adrian Ivasku answered Oct 20 '22


LOAD DATA LOW_PRIORITY LOCAL INFILE 'C:\\Learning\\App6_DBconnection\\CC.csv' 
INTO TABLE `test`.`ccd` 
CHARACTER SET armscii8 
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"' LINES TERMINATED BY '\r\n' 
IGNORE 1 LINES (`Cd_Type`, `Full_Name`, `Billing_Date`);

This will work even for large data sets of more than 1.5 million records.
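For loads this large, it is worth sanity-checking that every record actually arrived. One simple way is to compare the file's record count with the table's row count; a sketch (file name from the answer above, one header row subtracted because of IGNORE 1 LINES):

```shell
# Count data records in the CSV: total lines minus the one header row.
# Compare this number with SELECT COUNT(*) FROM `test`.`ccd` after the load.
echo $(( $(wc -l < CC.csv) - 1 ))
```

A mismatch usually points at terminator or quoting problems rather than a size limit.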

Vijay Kumar answered Oct 20 '22