I tried the following script:
LOAD DATA LOCAL INFILE 'myfile.csv'
REPLACE INTO TABLE `mydb`.`mytable`
CHARACTER SET latin1 FIELDS TERMINATED BY ','
OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES (`field1`, `field2`, `field3`, `field4`, `field5`, `field6`);
When I use a file of 500K records it works, but when I try a CSV file of 4 million records it returns:
Query OK, 0 rows affected (2.79 sec)
Records: 0 Deleted: 0 Skipped: 0 Warnings: 0
And of course nothing was actually loaded in just 2.79 seconds!
My RAM is 4GB and my input file (the large one) is 370MB.
Can anyone suggest a solution?
It's possible that the line endings in the large file are not '\r\n'.
Change LINES TERMINATED BY '\r\n' to LINES TERMINATED BY '\n'.
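For example, assuming the same table and column names as in the question, the revised statement would look like this:

LOAD DATA LOCAL INFILE 'myfile.csv'
REPLACE INTO TABLE `mydb`.`mytable`
CHARACTER SET latin1
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '"'
LINES TERMINATED BY '\n'  -- Unix-style line endings instead of Windows-style
IGNORE 1 LINES
(`field1`, `field2`, `field3`, `field4`, `field5`, `field6`);

One caveat: if some rows in the file do end in '\r\n', the trailing '\r' will be stored as part of the last column's value, so it may be worth normalizing the file's line endings before loading.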