 

Inserting 1000s of rows to SQL table using SSMS

I have a SQL script with INSERT statements for thousands of rows (approximately 12,000). When I try running the script in SSMS, it throws an 'Out of memory' exception after a while.

"An error occurred while executing batch. Error message is: Exception of type 'System.OutOfMemoryException' was thrown."

I have SQL Server 2008 on Vista with 3 GB of RAM.

Any thoughts or pointers would be appreciated!

asked Dec 22 '22 by pencilslate

2 Answers

You will have to split up the commands. The easiest way is to add a GO every 10 lines or so.

Basically, SSMS is trying to load all of your text into a single SqlCommand.CommandText and execute it. That won't work.

You need to get it to batch them. GO is an easy split point in SSMS: everything up to that point is executed as one batch, and then it continues with the next.

LINE1
LINE2
...
GO

LINE11
LINE12

That will be run as two SqlCommands against the database.
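
With real INSERT statements the same idea looks like this (a minimal sketch; dbo.MyTable and its columns are made up):

INSERT INTO dbo.MyTable (Id, Name) VALUES (1, 'row 1');
INSERT INTO dbo.MyTable (Id, Name) VALUES (2, 'row 2');
-- ... more INSERTs ...
GO

INSERT INTO dbo.MyTable (Id, Name) VALUES (11, 'row 11');
INSERT INTO dbo.MyTable (Id, Name) VALUES (12, 'row 12');
-- ... more INSERTs ...
GO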
If you need them all run in a single transaction, one option is to write a command-line app that loads each statement and executes it within a transaction. Within SSMS itself you can also wrap the batches in an explicit BEGIN TRANSACTION ... COMMIT, since a transaction opened in one batch stays open across GO separators on the same connection, as in the sketch below.
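
For instance, a sketch of the single-transaction version (again with a made-up table):

SET XACT_ABORT ON;  -- a run-time error rolls back the whole transaction
BEGIN TRANSACTION;
GO

INSERT INTO dbo.MyTable (Id, Name) VALUES (1, 'row 1');
INSERT INTO dbo.MyTable (Id, Name) VALUES (2, 'row 2');
GO

INSERT INTO dbo.MyTable (Id, Name) VALUES (3, 'row 3');
GO

COMMIT TRANSACTION;
GO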

You could also build an SSIS package, but that is a LOT of work and I don't recommend it unless you need to repeat this process every so often.

answered Dec 24 '22 by Jason Short

System.OutOfMemoryException is a CLR exception, not a SQL Server exception. SQL Server would raise error 701, and besides, it would not run out of memory simply from executing some inserts in the first place.

The fact that you get a CLR exception indicates that the problem is perhaps in SSMS itself. Make sure your script does not return spurious result sets and messages to SSMS. Also, try executing the script from sqlcmd.
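
For example, a possible sqlcmd invocation (server, database, and file names here are placeholders; -E uses Windows authentication and -i points at the script file):

sqlcmd -S YourServer -d YourDatabase -E -i inserts.sql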

answered Dec 24 '22 by Remus Rusanu