The standard mysqldump command that I use is
mysqldump --opt --databases $dbname --host=$dbhost --user=$dbuser --password=$dbpass | gzip > $filename
To dump multiple databases
mysqldump --opt --databases $dbname1 $dbname2 $dbname3 $dbname_etc --host=$dbhost --user=$dbuser --password=$dbpass | gzip > $filename
My question is how do you dump multiple databases from different MySQL accounts into just one file?
UPDATE: When I meant 1 file, I mean 1 gzipped file with the difference sql dumps for the different sites inside it.
To back up multiple MySQL databases with one command, use the --databases option followed by the list of databases you want to back up, with each database name separated by a space. The resulting dump file will contain all of the listed databases.
In speed comparisons, mysqlpump is among the fastest tools, followed closely by mydumper when using gzip. mysqldump is the classic old-school way to perform dumps and is the slowest of the four tools. On a server with more CPUs, the potential parallelism increases, giving even more advantage to the tools that can benefit from multiple threads.
You can view all existing users with the query SELECT * FROM mysql.user; — knowing this, it's fairly clear that mysqldump shouldn't do anything with user accounts by default.
For every MySQL server account, dump the databases into separate files
Then concatenate the dump files and compress the result with this command:
cat dump_user1.sql dump_user2.sql | gzip > super_dump.gz
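The mechanics of that concatenation can be sketched with stand-in files (in practice, dump_user1.sql and dump_user2.sql would each come from a mysqldump run under that account's credentials):

```shell
# Stand-ins for the per-account dumps; each one would really be
# produced by mysqldump under a different account's credentials.
printf -- '-- dump for account 1\nCREATE DATABASE site1;\n' > dump_user1.sql
printf -- '-- dump for account 2\nCREATE DATABASE site2;\n' > dump_user2.sql

# Concatenate both dumps and compress them into a single file:
cat dump_user1.sql dump_user2.sql | gzip > super_dump.gz

# Decompressing yields both dumps back to back:
gunzip -c super_dump.gz
```

Note that the result is one compressed SQL stream, not an archive with separate members — the tar approach in the answer below keeps each dump as its own file.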
There is a similar post on Superuser.com website: https://superuser.com/questions/228878/how-can-i-concatenate-two-files-in-unix
Nobody seems to have clarified this, so I'm going to give my 2 cents.
A note up front: my experience is with Bash and may be exclusive to it, so variables and looping might work differently in your environment.
The best way to get an archive with separate files inside it is to use either zip or tar; I prefer tar due to its simplicity and availability.
Tar itself doesn't do compression, but bundled with bzip2 or gzip it can provide excellent results. Since your example uses gzip, I'll use that in my demonstration.
First, let's attack the problem of the MySQL dumps themselves: the mysqldump command does not split its output into separate files (to my knowledge, anyway). So let's make a small workaround to create one file per database.
mysql -s -r -p$dbpass --user=$dbuser -e 'show databases' | while read db; do mysqldump -p$dbpass --user=$dbuser "$db" > "${db}.sql"; done
So now we have a one-liner that dumps each database to its own file; to change where those files are written, simply edit the part after the > symbol.
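One caveat: show databases also lists MySQL's system schemas (information_schema, mysql, performance_schema, sys), which you usually don't want in a site backup. A small filter sketch, using a stand-in list in place of the live mysql call:

```shell
# Stand-in for: mysql -s -r -p$dbpass --user=$dbuser -e 'show databases'
printf 'information_schema\nmysql\nperformance_schema\nsys\nsite1\nsite2\n' |
grep -Ev '^(information_schema|mysql|performance_schema|sys)$' |
while read db; do
    # Replace echo with the real mysqldump call from above.
    echo "would dump: $db"
done
```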
Next, let's look at the syntax for tar:
tar -czf <output-file> <input-file-1> <input-file-2>
This form lets us specify any number of files to archive.
The options break down as follows:
c - create an archive
z - compress with gzip
f - write the archive to the named file
j - compress with bzip2 (used instead of z)
Our next problem is keeping a list of all the newly created files. We'll expand the loop to append each filename to a variable as it runs through the databases found in MySQL. One catch: in Bash, a loop on the right side of a pipe runs in a subshell, so any variable set inside it is lost afterwards; feeding the loop through process substitution instead keeps DBLIST in the current shell.
DBLIST=""; while read db; do mysqldump -p$dbpass --user=$dbuser "$db" > "${db}.sql"; DBLIST="$DBLIST ${db}.sql"; done < <(mysql -s -r -p$dbpass --user=$dbuser -e 'show databases')
Now we have a DBLIST variable holding the names of all the files that will be created, so we can extend the one-liner to run the tar command after everything has been dumped (note that $DBLIST is left unquoted so each filename becomes its own argument):
DBLIST=""; while read db; do mysqldump -p$dbpass --user=$dbuser "$db" > "${db}.sql"; DBLIST="$DBLIST ${db}.sql"; done < <(mysql -s -r -p$dbpass --user=$dbuser -e 'show databases') && tar -czf $filename $DBLIST
This is a very rough approach and doesn't let you manually specify databases. To achieve that, the following command will create a tar file containing only the databases you specify (the for list must not be quoted, or the loop runs once over the whole string):
DBLIST=""; for db in <database1-name> <database2-name>; do mysqldump -p$dbpass --user=$dbuser $db > ${db}.sql; DBLIST="$DBLIST ${db}.sql"; done && tar -czf $filename $DBLIST
The technique of looping through the databases listed by MySQL comes from the stackoverflow.com question "mysqldump with db in a separate file", simply modified to fit your needs.
And to have the script clean up after itself in the same one-liner, simply add the following at the end of the command (again unquoted, so rm sees each filename as a separate argument):
&& rm $DBLIST
making the command look like this:
DBLIST=""; for db in <database1-name> <database2-name>; do mysqldump -p$dbpass --user=$dbuser $db > ${db}.sql; DBLIST="$DBLIST ${db}.sql"; done && tar -czf $filename $DBLIST && rm $DBLIST
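Putting it all together for the original multi-account question, here is a hedged, runnable sketch of the same dump-then-tar-then-clean flow. A stub function stands in for the real mysqldump call, and the database names are placeholders you'd substitute with the databases from each account:

```shell
#!/bin/bash
# Stand-in for the real per-account dump, which would look like:
#   mysqldump --host="$host" --user="$user" --password="$pass" "$1" > "$2"
dump_db() { printf -- '-- dump of %s\n' "$1" > "$2"; }

DBLIST=""
# Placeholder names; list every database from every MySQL account here.
for db in site1_db site2_db; do
    dump_db "$db" "${db}.sql"
    DBLIST="$DBLIST ${db}.sql"
done

# $DBLIST is deliberately unquoted so each filename is its own argument.
tar -czf super_dump.tar.gz $DBLIST
rm $DBLIST   # clean up the intermediate .sql files
```

The end result is one super_dump.tar.gz with each database's dump as a separate file inside it, matching the update in the question.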