
Large MySQL dump fails to import

I have a MySQL dump called dbBACKUP.sql, about 152 MB, containing thousands of records. I use the following command to import it:

mysql -u root -p --default-character-set=utf8 phpbb3 < dbBACKUP.sql

After supplying the root password, I go to phpMyAdmin to check the data being imported into the database. I notice the amount of imported data grows only slowly, judging by the changes in both the total row count and the total size in MB, until it reaches a point where nothing seems to change with each refresh of phpMyAdmin.

I believe this may be a memory issue on the server. Are there any configuration settings that would let MySQL use more memory, or any other way to improve the performance of this task?

This occurred on my own desktop running Windows 7 64-bit. The MySQL server version is 5.6.16.

Asked by SaidbakR

1 Answer

I did a Google search and found this blog post that might help you: http://cmanios.wordpress.com/2013/03/19/import-a-large-sql-dump-file-to-a-mysql-database-from-command-line/
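
Before changing anything, it may help to see what the server is currently using; a quick sanity check from the mysql client (these are standard MySQL server variables, the same ones the blog adjusts):

-- show the current values of the variables adjusted below
SHOW VARIABLES LIKE 'max_allowed_packet';
SHOW VARIABLES LIKE 'net_buffer_length';
SHOW VARIABLES LIKE 'read_buffer_size';

The steps from the blog, lightly annotated: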

-- connect to the database server
mysql -u root -p

-- set network buffer length to a large value (1048576 bytes = 1 MB, the maximum)
set global net_buffer_length=1048576;

-- set maximum allowed packet size to a large value (1073741824 bytes = 1 GB, the maximum)
set global max_allowed_packet=1073741824;

-- set read buffer size to a large value (2147479552 bytes, roughly 2 GB, the maximum)
set global read_buffer_size=2147479552;

-- disable foreign key checking during the import to avoid delays and ordering errors
SET foreign_key_checks = 0;

-- select the target database if the dump file does not do so itself
use phpbb3;

-- import your sql dump file
source C:\[your_path]\dbBACKUP.sql

-- re-enable foreign key checking after the import
SET foreign_key_checks = 1;
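
Two notes on this approach: SET GLOBAL changes last only until the server restarts (to keep them permanently they would have to go into my.ini), and in MySQL 5.6 the session copy of max_allowed_packet is taken from the global value at connect time, so if the import still aborts with a packet error, reconnect after issuing the SET GLOBAL statements and run the source command again. To watch progress without phpMyAdmin, a rough check against information_schema works; row counts are only estimates for InnoDB tables, and phpbb3 is the database name from the question:

-- approximate per-table row counts and sizes for the target database
SELECT TABLE_NAME, TABLE_ROWS, ROUND(DATA_LENGTH/1024/1024, 1) AS data_mb
FROM information_schema.TABLES
WHERE TABLE_SCHEMA = 'phpbb3';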
Answered by oerl