I'm trying to load a 95 GB CSV file into a MySQL database (MySQL 5.1.36) via the following commands:
CREATE TABLE MOD13Q1 (
    rid INT UNSIGNED NOT NULL AUTO_INCREMENT,
    gid MEDIUMINT(6) UNSIGNED NOT NULL,
    yr SMALLINT(4) UNSIGNED NOT NULL,
    dyyr SMALLINT(4) UNSIGNED NOT NULL,
    ndvi DECIMAL(7,4) NOT NULL COMMENT 'NA value is 9',
    reliability TINYINT(4) NOT NULL COMMENT 'NA value is 9',
    ndviquality1 TINYINT(1) NOT NULL,
    ndviquality2 TINYINT(1) NOT NULL,
    PRIMARY KEY (rid),
    KEY (gid)
) ENGINE = MyISAM;
LOAD DATA INFILE 'datafile.csv' INTO TABLE MOD13Q1
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(gid, yr, dyyr, ndvi, reliability, ndviquality1, ndviquality2);
I'm running this script from the DOS command line at the moment, but the database is not responding. It works fine for smaller CSV files (1.5 GB). Will it work for a file this size?
Do you have any recommendations on how to do this more efficiently/faster? Would ENGINE = CSV be an alternative? (Indexing is not supported there, so queries might run very slowly.)
Update
Thanks for the tips; it worked!
mysql> LOAD DATA INFILE 'E:\\AAJan\\data\\data.csv' INTO TABLE MOD13Q1
-> FIELDS TERMINATED by ','
-> LINES TERMINATED BY '\r\n'
-> IGNORE 1 LINES
-> (gid, yr, dyyr, ndvi, reliability,
-> ndviquality1, ndviquality2
-> ) ;
Query OK, -1923241485 rows affected (18 hours 28 min 51.26 sec)
Records: -1923241485 Deleted: 0 Skipped: 0 Warnings: 0
mysql>
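Side note: the negative row count is almost certainly the old client's signed 32-bit counter wrapping around; read as an unsigned value, -1923241485 corresponds to roughly 2.37 billion rows. If you want to confirm the real count, you can ask the table directly (on MyISAM the exact row count is stored in the table's metadata, so this returns instantly):

SELECT COUNT(*) FROM MOD13Q1;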
I hope this helps others avoid having to split their data into chunks.
There is no easy way; you will have to split your data into chunks and then import those, as sketched below...
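If you do go the chunk route, the import side is just one LOAD DATA statement per piece; the splitting itself has to happen outside MySQL with a command-line tool. A minimal sketch, assuming hypothetical chunk files named chunk_001.csv, chunk_002.csv, and so on (note that only the first chunk still contains the header row and needs IGNORE 1 LINES):

LOAD DATA INFILE 'E:\\AAJan\\data\\chunk_001.csv' INTO TABLE MOD13Q1
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(gid, yr, dyyr, ndvi, reliability, ndviquality1, ndviquality2);

LOAD DATA INFILE 'E:\\AAJan\\data\\chunk_002.csv' INTO TABLE MOD13Q1
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
(gid, yr, dyyr, ndvi, reliability, ndviquality1, ndviquality2);

-- ...and so on for the remaining chunks.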
You should disable all the constraints while you are importing. Apart from that, I think it should work, but note that it is going to take a while, probably hours.
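For a MyISAM table like this one there are no enforced foreign keys to worry about, but you can defer secondary-index maintenance and skip per-row checks. A minimal sketch of that approach, wrapping the original LOAD DATA in standard MySQL statements:

-- Defer non-unique index updates until the load is finished (MyISAM):
ALTER TABLE MOD13Q1 DISABLE KEYS;

-- Skip per-row uniqueness and foreign-key checks for this session:
SET unique_checks = 0;
SET foreign_key_checks = 0;

LOAD DATA INFILE 'datafile.csv' INTO TABLE MOD13Q1
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(gid, yr, dyyr, ndvi, reliability, ndviquality1, ndviquality2);

-- Rebuild the indexes in one pass, then restore the checks:
ALTER TABLE MOD13Q1 ENABLE KEYS;
SET unique_checks = 1;
SET foreign_key_checks = 1;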