
Why is importing an SQL file so slow?

I have an SQL file containing two tables with around 600,000 rows altogether. Yesterday, I tried to import the file into my MySQL database on Fedora 16, and it took over 2 hours to import the file. On my Windows PC it took 7 minutes. My Linux and Windows machines have exactly the same hardware. A couple of my friends tried it too, and they had a similar experience.

The command we were using was: mysql -u root database_name < sql_file.sql.

Why is there such a difference in speed?

Asked Apr 19 '12 by Lars Steen

People also ask

Why is my SQL database so slow?

If your database is handling a high volume of traffic, this can slow it down. When there are too many queries to process at once, the CPU becomes a bottleneck, resulting in a slow database.

What is Bacpac file in SQL Server?

A BACPAC file is a ZIP file with an extension of BACPAC containing the metadata and data from the database. A BACPAC file can be stored in Azure Blob storage or in local storage in an on-premises location and later imported back into Azure SQL Database, Azure SQL Managed Instance, or a SQL Server instance.

How long does it take to export a database?

The Export Database process can take from a few hours to several days (in extreme cases). You can export only one database at a time.


2 Answers

My bet is that Fedora 16 is honoring the transaction/sync semantics and Windows is not. If you do the math, 600,000 updates in two hours is 5,000 per minute. That's the same order of magnitude as a disk's rotation rate, which suggests that each row's commit is waiting for a physical disk flush.

You can try adding SET autocommit=0; to the beginning of your import file and COMMIT; to the end. See this page for more information.
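As a minimal sketch, the modified import file would look something like this (the table and column names are placeholders, not taken from the original file):

SET autocommit=0;
-- original contents of sql_file.sql, i.e. thousands of INSERT statements
INSERT INTO my_table (id, name) VALUES (1, 'first row');
INSERT INTO my_table (id, name) VALUES (2, 'second row');
-- ... remaining rows ...
COMMIT;

With autocommit disabled, the server no longer has to flush each single-row insert to disk as its own transaction; everything is written once at the final COMMIT.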

Answered by David Schwartz


Alternatively, export the .sql file with bulk-insert-friendly options and import that instead. Try these options when taking the backup with mysqldump:

--extended-insert: write multiple-row INSERT statements

--quick: do not buffer row data in memory; useful when the tables are large

Note: make sure you increase max_allowed_packet to 32M or more in the my.cnf file before generating the .sql file. A sketch of the full workflow follows below.
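A minimal sketch of that workflow, assuming the database and file names from the question (the setting is shown under [mysqld]; depending on your setup you may also need to raise it for the client tools):

In my.cnf:

[mysqld]
max_allowed_packet=32M

Then, on the command line:

mysqldump -u root --extended-insert --quick database_name > sql_file.sql
mysql -u root database_name < sql_file.sql

Multi-row INSERT statements mean far fewer statements for the server to parse and commit, which is usually much faster than one INSERT per row.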

Answered by Mahesh Patil