 

Export a large MySQL table as multiple smaller files

Tags: mysql, export

I have a very large MySQL table on my local dev server: over 8 million rows of data. I loaded the table successfully using LOAD DATA INFILE.

I now wish to export this data and import it onto a remote host.

I tried LOAD DATA LOCAL INFILE against the remote host, but after around 15 minutes the connection to the remote host failed. I think the only solution is to export the data into a number of smaller files.

The tools at my disposal are PhpMyAdmin, HeidiSQL and MySQL Workbench.

I know how to export as a single file, but not multiple files. How can I do this?
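For illustration, a chunked export of this kind (table, column and file names below are just placeholders, and an auto-increment id column is assumed) might look like:

    # hypothetical chunked export on the local dev server, one file per id range
    mysql -u user -p -e "SELECT * FROM big_table WHERE id BETWEEN 1 AND 1000000 INTO OUTFILE '/tmp/big_table_part_01.txt'" db_name
    mysql -u user -p -e "SELECT * FROM big_table WHERE id BETWEEN 1000001 AND 2000000 INTO OUTFILE '/tmp/big_table_part_02.txt'" db_name
    # ...repeat per id range, then LOAD DATA [LOCAL] INFILE each part on the remote host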

asked Oct 28 '12 by Billy


3 Answers

I just did an import/export of a (partitioned) table with 50 million records; it took just 2 minutes to export it from a reasonably fast machine and 15 minutes to import it on my slower desktop. There was no need to split the file.

mysqldump is your friend, and since you have a lot of data it's better to compress it:

 # on the source machine: dump the table and compress it on the fly
 @host1:~ $ mysqldump -u <username> -p <database> <table> | gzip > output.sql.gz
 # copy the compressed dump to the remote host, then remove the local copy
 @host1:~ $ scp output.sql.gz host2:~/
 @host1:~ $ rm output.sql.gz
 # on the remote host: decompress and feed the dump straight into mysql
 @host1:~ $ ssh host2
 @host2:~ $ gunzip < output.sql.gz | mysql -u <username> -p <database>
 @host2:~ $ rm output.sql.gz
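
A variation on the same idea, as a sketch: the dump can be piped straight over ssh without an intermediate file. This assumes the MySQL credentials on host2 are available non-interactively (e.g. in ~/.my.cnf), since there is no terminal to answer a password prompt:

    # stream the compressed dump directly into the remote mysql client
    # (assumes host2 reads credentials from ~/.my.cnf)
    mysqldump -u <username> -p <database> <table> | gzip | ssh host2 "gunzip | mysql <database>"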
answered Sep 23 '22 by Riccardo Galli

If you are not comfortable with using the mysqldump command line tool, here are two GUI tools that can help you with that problem, although you have to be able to upload them to the server via FTP!

Adminer is a slim and very efficient DB manager tool that is at least as powerful as phpMyAdmin and consists of only ONE SINGLE FILE that has to be uploaded to the server, which makes it extremely easy to install. It works far better with large tables / DBs than PMA does.

MySQLDumper is a tool developed especially to export / import large tables / DBs, so it will have no problem with the situation you describe. The only downside is that it is a bit more tedious to install, as there are more files and folders (~350 files in ~1.5MB), but it shouldn't be a problem to upload it via FTP either, and it will definitely get the job done :)

So my advice would be to first try Adminer, and if that one also fails, go the MySQLDumper route.

answered Sep 23 '22 by Larzan


Take a look at mysqldump

Your commands would be (from the terminal):

Export db_name from your MySQL server into backupfile.sql:

mysqldump -u user -p db_name > backupfile.sql

Import backupfile.sql into db_name:

mysql -u user -p db_name < backupfile.sql

You have two options for splitting the data:

  1. Split the output text file into smaller files (as many as you need; many tools can do this, e.g. split; see the sketch after this list).
  2. Export one table at a time by adding the table name after the db_name, like so:

    mysqldump -u user -p db_name table_name > backupfile_table_name.sql
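
A minimal sketch of option 1 (file names are placeholders; concatenating the pieces before import keeps any statement that might straddle a boundary intact):

    # split the dump into pieces of roughly 500,000 lines each
    split -l 500000 backupfile.sql backup_part_
    # on the target machine, feed the pieces back to mysql in order
    cat backup_part_* | mysql -u user -p db_name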

Compressing the file(s) (plain text) is very efficient and can shrink them to about 20%-30% of their original size.

Copying the files to the remote server should be done with scp (secure copy), and interaction usually takes place over ssh.
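
Putting the last two points together, a possible sketch (host and user names are placeholders, and the remote import assumes credentials in ~/.my.cnf since there is no terminal to prompt on):

    # compress the pieces, copy them over, then import on the remote host
    gzip backup_part_*
    scp backup_part_*.gz user@remote_host:~/
    ssh user@remote_host 'zcat backup_part_*.gz | mysql db_name'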

Good luck.

answered Sep 24 '22 by Reut Sharabani