I have a not-that-large database that I'm trying to migrate to SQL Azure, about 1.6 gigs. I have the BACPAC file in blob storage, start the import, and then... nothing. It gets to 5% as far as the status goes, but otherwise goes nowhere fast. After 10 hours, the database size appears to be 380 MB, and the monitoring shows on average around 53 successful connections per hour, with no failures. It appears to still be going.
I took the same BACPAC to my local machine at home, and I can import the same database in just under three minutes, so I'm making the assumption that the integrity of the file is all good.
What could possibly be the issue here? I can't just keep the app I'm migrating offline for days while this goes on. There must be something fundamentally different that it's choking on.
A DACPAC is a compressed package with a .dacpac extension that contains a database's schema; similarly, a BACPAC is a compressed package with a .bacpac extension that contains both the schema and the data.
Scaling storage up or down in the General Purpose service tier consists of updating metadata and propagating a response for the submitted request. It is a fast operation that completes within about 5 minutes, with no downtime and no failover.
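For example, changing the maximum data size can be scripted as a single T-SQL statement against the logical server's master database; this is only a sketch, and the database name and size below are placeholders rather than values from the question:

```sql
-- Run in the master database of the Azure SQL logical server.
-- [MyAppDb] and 250 GB are placeholder values.
ALTER DATABASE [MyAppDb] MODIFY (MAXSIZE = 250 GB);
```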
A BACPAC file is a ZIP file with a .bacpac extension containing the metadata and data from the database. A BACPAC file can be stored in Azure Blob storage or in local storage in an on-premises location and later imported back into Azure SQL Database, Azure SQL Managed Instance, or a SQL Server instance.
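Because it really is just a ZIP archive, you can sanity-check a BACPAC locally by copying it with a .zip extension and expanding it; you should see entries such as model.xml, Origin.xml, and a Data folder. The file names below are placeholders:

```powershell
# MyDb.bacpac is a placeholder for whatever BACPAC you exported.
Copy-Item .\MyDb.bacpac .\MyDb.zip
Expand-Archive .\MyDb.zip -DestinationPath .\MyDb_contents
Get-ChildItem .\MyDb_contents   # model.xml, Origin.xml, Data\, etc.
```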
I've yet to try this myself (I've spent 5 hours so far today waiting for a ~11 GB Test database to import into an S3-sized Azure SQL database)...
But MS themselves mention this is due to them not assigning enough hamsters to run the wheel, and suggest using SqlPackage.exe directly from the command line as a workaround for slow imports (see here: Azure SQL Database and Managed Instance Import/Export service takes a long time to import or export a database).
Interested to see what will happen; I can't wait this long to migrate the Prod DB in a few days' time, so I'll give this tool a try!
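For anyone wanting to try the same workaround, a typical SqlPackage import invocation looks roughly like this; it's only a sketch, and the server, database, credentials, and service-objective values below are placeholders, not my actual environment:

```powershell
SqlPackage.exe /Action:Import `
  /SourceFile:"C:\temp\MyDb.bacpac" `
  /TargetServerName:"myserver.database.windows.net" `
  /TargetDatabaseName:"MyDb" `
  /TargetUser:"myadmin" /TargetPassword:"<password>" `
  /p:DatabaseEdition=Premium /p:DatabaseServiceObjective=P15
```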
UPDATE:
After waiting 4+ hours for my BACPAC file to import, it crashed.
Interestingly, the database from the crashed import was still present in the Azure Portal, and I could see the DTU usage (S3 database) had been pegged at 100% for almost the whole 4 hours...
So I wondered what would happen if I set the DB performance ALL THE WAY UP for the import: I deleted the DB from the crashed import and started it again, this time with the performance set to P15 (4,000 DTUs vs. 50 for an S2).
Guess what...
The import took just a little over 3 MINUTES
Turned the DB back down to S3 (what the client has spec'ed and is happy to pay for).
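If you'd rather not click through the portal, the tier changes (up before a big operation, back down afterwards) can also be scripted in T-SQL against the master database; this is just a sketch, and the database name below is a placeholder:

```sql
-- Run in the master database of the logical server. [MyDb] is a placeholder.
-- Crank the database up for the heavy lifting:
ALTER DATABASE [MyDb] MODIFY (EDITION = 'Premium', SERVICE_OBJECTIVE = 'P15');

-- Drop it back to what the client is paying for once the import finishes:
ALTER DATABASE [MyDb] MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S3');
```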
That's 4 hours of my life I'll never get back...
Ugh, I hate answering my own question, but I figured out what the issue is. It appears to have to do with the new Basic/Standard/Premium tiers (announced April 2014). The database imports to the old Web/Business editions with no problem. Because the new tiers are built around throttling for "predictable performance," they make the high-volume transactions of an import crazy slow.
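If you're not sure which edition and performance level a database actually landed on, you can check it in T-SQL; the database name here is a placeholder:

```sql
-- Check the current edition and performance level of an Azure SQL database.
SELECT DATABASEPROPERTYEX('MyDb', 'Edition')          AS Edition,
       DATABASEPROPERTYEX('MyDb', 'ServiceObjective') AS ServiceObjective;
```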