I am backing up a database whose size is about 190 GB. I want to back up the database to a local file. This is the command I am using:
mysqldump -u root -p tradeData > /db_backup/tradeData.sql
I have enough space on my machine. I have tried several times with no errors, but the resulting file is always around 122 GB.
Does anyone have experience backing up large databases? My machine runs Linux.
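A common variant for a database of this size (a sketch, assuming gzip is available and reusing the paths from the command above) streams the dump through a compressor, which shrinks the output file considerably:

# --single-transaction takes a consistent snapshot of InnoDB tables without locking them
mysqldump -u root -p --single-transaction tradeData | gzip > /db_backup/tradeData.sql.gz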
To dump entire databases, do not name any tables following db_name, or use the --databases or --all-databases option. To see a list of the options your version of mysqldump supports, issue the command mysqldump --help.
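For example, reusing the credentials and path from the question, either of these dumps whole databases (with --databases, the dump also includes the CREATE DATABASE and USE statements, which a plain db_name dump omits):

mysqldump -u root -p --databases tradeData > /db_backup/tradeData.sql
mysqldump -u root -p --all-databases > /db_backup/all_databases.sql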
mysqldump is a command-line utility used to generate a logical backup of a MySQL database. It produces the SQL statements that can recreate the database objects and data, and it can also write its output as XML or as delimited text (including CSV).
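As a sketch of those alternate formats (the /db_backup/tab directory is hypothetical, and --tab requires that the MySQL server itself can write to it, subject to secure_file_priv):

# dump the database as well-formed XML
mysqldump -u root -p --xml tradeData > /db_backup/tradeData.xml
# one .sql file (table definition) and one tab-delimited .txt data file per table
mysqldump -u root -p --tab=/db_backup/tab tradeData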
In one comparison with an alternative dump tool, it took a total of 1 minute 27 seconds to dump the entire database (the same data as used for mysqldump), and the tool also shows its progress, which is really helpful for knowing how much of the backup has completed.
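mysqldump itself prints nothing while it runs, but one common way to watch its progress (a sketch, assuming the pv utility is installed) is to pipe the dump through pv, which reports bytes written and throughput; with pv -s and an expected size, it can also show a percentage:

mysqldump -u root -p tradeData | pv > /db_backup/tradeData.sql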
A size comparison like this won't give you a one-to-one correspondence between your local dump file and what is actually in the system. A live database holds index pages and internal structures that exist only inside the running server, not in a logical dump. As RolandoMySQLDBA explains:
From the dump file size, it is hard to judge, because the combined total size of data pages and index pages may be far less than the size of the ibdata1 file the dump was created from.
So my guess is that your database includes InnoDB tables, among other things that make the on-disk database larger than a bare logical dump.
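One way to check this yourself (a sketch; tradeData is the schema name from the question) is to compare the logical data and index sizes the server reports against the size of the dump file:

-- per-schema logical size in GB, as tracked by the server
SELECT table_schema,
       ROUND(SUM(data_length + index_length) / 1024 / 1024 / 1024, 2) AS size_gb
FROM information_schema.TABLES
WHERE table_schema = 'tradeData'
GROUP BY table_schema;

Note that this total includes index pages, which a logical dump does not store (indexes are rebuilt when the dump is reloaded), so the .sql file can legitimately be much smaller than the on-disk footprint.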