I'm trying to import a very large .sql file into my database; the file is currently 20 GB. I tried mysql database < backup.sql from the console, but it has now been running for more than 24 hours and the MySQL server has stopped responding.
How can I import such a large file? I think splitting it is the way to go, but how do I split it correctly? It is only one table with many INSERT statements.
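For reference, if each INSERT statement in the dump occupies a single line, the standard split utility can chunk the file on line boundaries without cutting a statement in half. A minimal sketch (the small stand-in file and chunk size are for demonstration only; a real dump would use a much larger line count):

```shell
# Stand-in for the real 20 GB backup.sql; assumes one INSERT per line.
printf 'INSERT INTO t VALUES (1);\nINSERT INTO t VALUES (2);\nINSERT INTO t VALUES (3);\n' > backup.sql

# Split into chunks of 2 lines each (use e.g. -l 100000 for a real dump):
split -l 2 backup.sql chunk_

# Show the resulting chunk files:
ls chunk_*

# Each chunk can then be imported separately, e.g.:
#   for f in chunk_*; do mysql mydb < "$f"; done
```

Importing chunk by chunk also makes it possible to resume after a failure instead of restarting the whole 20 GB import.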
This is where the sqlcmd command-line tool comes in handy. It can import large .sql files and can be run from a batch script, an installer, etc. (Note that sqlcmd is Microsoft SQL Server's command-line tool, so this applies only if you are on SQL Server rather than MySQL.)
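A basic sqlcmd invocation could look like this (the server and database names are placeholders):

```
sqlcmd -S myserver -d mydb -i backup.sql
```

Here -S names the server, -d the target database, and -i the input script file.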
MySQL has bulk loading built into the language: LOAD DATA INFILE. See below. Use this: http://dev.mysql.com/doc/refman/5.1/en/load-data.html
You'll need to reformat the file from INSERT statements into some form of CSV or similar, but this should be many orders of magnitude faster than individual statements, because it is a way of telling the RDBMS: "I'm about to upload a lot of data, so defer the re-indexing and bookkeeping overhead until the end, make sure you have enough space and grab it once instead of every time you fill up, and take the appropriate locks," etc.
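Once the data is in CSV form, the load itself is a single statement. A minimal sketch (the table name, column layout, and file name here are hypothetical; adjust the delimiters to match your CSV):

```sql
-- Hypothetical table and file names; assumes the dump was converted to CSV.
LOAD DATA LOCAL INFILE 'backup.csv'
INTO TABLE mytable
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
```

LOCAL reads the file from the client machine; drop it if the file already sits on the server host and the MySQL user has the FILE privilege.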