I have successfully dumped an entire MySQL database using mysqldump --databases, generating a nice .txt file. However, I can't see how to read the whole file back into MySQL in one go; mysqlimport seems to want just one table at a time.
When you've generated a file (say db-dump.sql) with mysqldump, you can import it into your other database with the mysql command:
mysql --user=XXX --password=XXX --host=YOUR_HOST DATABASE_NAME < db-dump.sql
And if you don't want the password to appear in the command, you can use:
mysql --user=XXX -p --host=YOUR_HOST DATABASE_NAME < db-dump.sql
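Note that since the dump in the question was created with --databases, the file should already contain CREATE DATABASE and USE statements, so you can usually omit the database name entirely when importing (assuming the dump really was made with that option):
mysql --user=XXX -p --host=YOUR_HOST < db-dump.sql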
As a sidenote, if you want to copy one DB to another, you don't need to use a file at all; you can pipe the output of mysqldump directly into mysql:
mysqldump --user=XXX --password=XXX --host=SOURCE_HOST SOURCE_DB | mysql --user=XXX --password=XXX --host=DESTINATION_HOST DESTINATION_DB
(It should even be faster, as you're not using a temporary file that resides on disk)
I do this frequently:
mysqldump -u username -p databasename > dump.sql
To load:
mysql -u username -p targetdatabasename < dump.sql
The -p switch instructs the client to prompt you for the password of user username once the command launches.
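You can also give the password inline by attaching it to -p with no space, though this is generally discouraged because it ends up in your shell history (a sketch, assuming no special characters that need quoting):
mysql -u username -ppassword targetdatabasename < dump.sql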
Your question is probably a duplicate, though.
You can just use 'source' from within the mysql client:
source dumpfile.sql
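A full interactive session might look like this (the database and file names here are placeholders; \. is an equivalent shorthand for source):
mysql -u user -p
mysql> USE targetdatabase;
mysql> source dumpfile.sql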
Or supply it directly from the command line:
mysql -u user -p database < dumpfile.sql
This works because the output of mysqldump is just a plain SQL script that mysql can execute as usual.
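If you want to sanity-check the import afterwards, a quick way (assuming the same user and database names as above) is to list the tables non-interactively:
mysql -u user -p -e "SHOW TABLES" database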