Load data infile for huge file (no warning, no error and no rows affected)

Tags: file-io, mysql, csv

I tried the following script:

LOAD DATA LOCAL INFILE 'myfile.csv'
    REPLACE INTO TABLE `mydb`.`mytable`
    CHARACTER SET latin1 FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
    LINES TERMINATED BY '\r\n'
    IGNORE 1 LINES (`field1`, `field2`, `field3`, `field4`, `field5`, `field6`);

When I use a file of 500K records it works, but when I try a CSV file of 4 million records it returns:

Query OK, 0 rows affected (2.79 sec) 
Records: 0  Deleted: 0  Skipped: 0  Warnings: 0

And of course nothing was actually inserted — 4 million rows can't be loaded in 2.79 seconds!

My RAM is 4GB and my input file (the large one) is 370MB.

Can anyone suggest a solution?

asked Feb 21 '23 by Afshin Moazami

1 Answer

It's possible that the large file uses Unix line endings ('\n') rather than Windows ones ('\r\n'). With LINES TERMINATED BY '\r\n', MySQL never finds a matching terminator, so it treats the whole file as one oversized line and reads zero records — which matches the "Records: 0 ... Warnings: 0" output.

Change LINES TERMINATED BY '\r\n' to '\n'.
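One quick way to confirm which terminator the file actually uses is to read the first line in binary mode. A minimal sketch in Python (the filename matches the question; adjust the path as needed):

```python
# Detect whether a CSV file uses Windows (\r\n) or Unix (\n) line endings
# by examining the first line in binary mode, where no newline translation occurs.
def detect_line_ending(path):
    with open(path, "rb") as f:
        first_line = f.readline()  # readline keeps the terminator bytes
    if first_line.endswith(b"\r\n"):
        return "\\r\\n"
    if first_line.endswith(b"\n"):
        return "\\n"
    return "unknown"

# Usage (assumes the file exists in the current directory):
# print(detect_line_ending("myfile.csv"))
```

Whichever terminator this reports is the one to put in the LINES TERMINATED BY clause.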

answered Feb 22 '23 by Phil