I am trying to execute a large SQL script which contains about 1,000,000 simple UPDATE statements.
The total size of this script file is about 100 MB.
When I run this script I'm getting an Out Of Memory exception.
When I split the file into chunks of 10 MB I can run each of them.
However, for convenience I would like to have a single script I can run in one go. Is there any statement I can introduce so that SQL Server releases the allocated memory after running each query, allowing me to execute this large script at once?
For example, SQL Server 2012 Express Edition can use at most 1.4 GB for its database cache. Other caches (such as the procedure cache, the metadata cache, and so on) can consume memory up to the limit set by the "max server memory" configuration option.
SQL Server doesn't move data from memory (the buffer pool) into tempdb in that way. It uses a (roughly) least-recently-used caching strategy, so under memory pressure, when new data needs to be pulled into memory, SQL Server evicts the least recently used pages from the buffer pool to make room for the new data.
If you have not done so already, insert a GO every thousand statements or so. Otherwise the whole file is treated as one large batch. SQL Server compiles a single execution plan per batch, and a batch of this size may be pushing you against resource limits.
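Editing a 100 MB file by hand is impractical, but inserting the separators can be automated. Here is a minimal Python sketch of that idea (the file names, the batch size of 1000, and the one-statement-per-line assumption are all assumptions, not something from your script):

```python
BATCH_SIZE = 1000  # statements per batch; tune to taste

def add_go_separators(lines, batch_size=BATCH_SIZE):
    """Return the input lines with a 'GO' batch separator inserted
    after every batch_size UPDATE statements, so SQL Server compiles
    and runs many small batches instead of one huge one.
    Assumes one statement per line."""
    out = []
    count = 0
    for line in lines:
        out.append(line)
        if line.lstrip().upper().startswith('UPDATE'):
            count += 1
            if count % batch_size == 0:
                out.append('GO')
    if count % batch_size != 0:
        out.append('GO')  # close the final partial batch
    return out

# Example usage (hypothetical file names):
# with open('big_script.sql') as src, open('big_script_batched.sql', 'w') as dst:
#     dst.write('\n'.join(add_go_separators(src.read().splitlines())) + '\n')
```

You can then run the rewritten file as a single script; each `GO`-delimited batch is parsed, planned, and executed separately.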
You might run into another type of resource problem if you run the whole file in a single transaction: the larger the transaction, the more disk space your transaction log file will need to complete the processing of your file.
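To keep the transaction log small as well, each batch can be committed separately. A variant sketch along the same lines (again an assumption-laden illustration: one statement per line, and a recovery model that allows the log to truncate after commits):

```python
def wrap_batches_in_transactions(lines, batch_size=1000):
    """Emit the statements grouped into explicit transactions of
    batch_size each, followed by a GO separator, so every batch
    commits before the next one starts. Assumes one statement per line."""
    out = []
    batch = []
    for line in lines:
        batch.append(line)
        if len(batch) == batch_size:
            out += ['BEGIN TRANSACTION;'] + batch + ['COMMIT TRANSACTION;', 'GO']
            batch = []
    if batch:  # flush the final partial batch
        out += ['BEGIN TRANSACTION;'] + batch + ['COMMIT TRANSACTION;', 'GO']
    return out
```

With this shape, the log only needs to hold one batch's worth of changes at a time instead of the whole million-row update.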