I have a decent-sized (2.6 GB) SQL file. I can't do the dump with table exclusions myself because this is on production, and our hosting people don't want us running dumps since that locks the db files. They replicated the database and provided me with a SQL file, and I was looking around to see if there is a way to ignore tables so that I can leave out the very large ones. I don't need them, since I have the data locally and they are not the reason I need the backup.
I don't want to import this into another db and do anything with it because the whole purpose of what I am looking for is a quicker way to import the data. It takes forever to import 2.6 gigs.
Going through the file and editing out the tables I don't want would seem to take a long time as well. My assumption is that there isn't a way to do this, but I figured I would post and see.
By default, the mysqldump utility, which is used to back up a MySQL database, locks all tables until the backup is complete.
To dump/export a MySQL database, execute the following command at the Windows command prompt: mysqldump -u username -p dbname > filename.sql. After entering that command you will be prompted for your password.
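If you are able to run the dump yourself (or can ask the host to), mysqldump also has an --ignore-table flag that addresses the original question directly; it must be repeated once per table and takes the db.table form. A minimal sketch, assuming a database named mydb and large tables named logs and sessions (all placeholder names):

    # Dump everything except the two large tables (names are placeholders)
    mysqldump -u username -p \
      --ignore-table=mydb.logs \
      --ignore-table=mydb.sessions \
      mydb > mydb_partial.sql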
To dump large tables, you can combine two flags: --single-transaction and --quick.
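Combined, that looks like the sketch below. --single-transaction dumps InnoDB tables from a consistent snapshot instead of locking them, and --quick streams rows one at a time rather than buffering each table in memory (mydb is a placeholder name):

    # Non-locking, low-memory dump of an InnoDB database
    mysqldump -u username -p --single-transaction --quick mydb > mydb.sql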
To import through a web interface such as phpMyAdmin: click the database name on the left-hand side of the page, select the Import tab, click the Browse button under "File to Import" and select the database file from your computer, then click Go to import the database.
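For a file as large as 2.6 GB, web importers tend to hit upload and timeout limits, so the command line is usually the faster route; a minimal sketch with placeholder names (the target database must already exist):

    # Feed the dump straight into the server
    mysql -u username -p dbname < filename.sql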
You'll need to parse the dump files to grab just the tables you need. Here is a pretty decent script that does this rather well:
http://kedar.nitty-witty.com/blog/mydumpsplitter-extract-tables-from-mysql-dump-shell-script
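If you'd rather not depend on the script, the same idea can be roughly approximated with sed, because mysqldump precedes each table with a "-- Table structure for table" comment. This is only a sketch under that assumption, with mytable as a placeholder name; the script linked above handles edge cases this one-liner does not:

    # Print from mytable's header up to the next table's header.
    # The final matched line belongs to the next table but is only a SQL
    # comment, so it is harmless on import; if mytable is the last table
    # in the file, sed simply prints through to end of file.
    sed -n '/^-- Table structure for table `mytable`/,/^-- Table structure for table `/p' dump.sql > mytable.sql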