We have code that reads in a set of SQL script files, does some processing on them, splits them into batches by finding the "GO" keyword, and then sends each batch to SQL Server using a separate SqlCommand.
Is there a better way of doing this?
(The batches mostly create tables, indexes, views and stored procedures. Speed is an issue because our integration tests call this code often. The SQL Server command-line tools may not be installed on the machine that runs this code.)
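For context, the current split-and-execute approach presumably looks something like the sketch below. The class and method names, the regex-based split on "GO", and the connection handling are illustrative assumptions, not the actual code.

```csharp
// Minimal sketch of the approach described above: split a script on "GO"
// batch separators and run each batch with its own SqlCommand.
using System.Data.SqlClient;
using System.IO;
using System.Text.RegularExpressions;

static class ScriptRunner
{
    public static void RunScript(string connectionString, string scriptPath)
    {
        string script = File.ReadAllText(scriptPath);

        // "GO" is a client-side batch separator, not T-SQL, so it must be
        // stripped before the text is sent to the server. This split only
        // handles GO on its own line.
        string[] batches = Regex.Split(
            script, @"^\s*GO\s*$",
            RegexOptions.Multiline | RegexOptions.IgnoreCase);

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            foreach (string batch in batches)
            {
                if (string.IsNullOrWhiteSpace(batch)) continue;
                using (var command = new SqlCommand(batch, connection))
                {
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}
```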
2 ideas...
Package the script and modify it so it can run as a single dynamic SQL snippet. Upload the entire batch as an nvarchar(max) parameter and run it on the server with sp_executesql. Since you control the scripts for integration testing, dynamic SQL is not much of an issue. A sketch follows below.
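A rough sketch of this idea, assuming the script has already been rewritten so it is valid as one dynamic SQL string (no GO separators, quoting handled). The method name and connection handling are placeholders.

```csharp
// Send the whole script as a single nvarchar(max) parameter and let the
// server execute it via sp_executesql.
using System.Data;
using System.Data.SqlClient;

static class DynamicSqlRunner
{
    public static void RunAsDynamicSql(SqlConnection connection, string wholeScript)
    {
        using (var command = new SqlCommand("sp_executesql", connection))
        {
            command.CommandType = CommandType.StoredProcedure;

            // Size -1 maps to nvarchar(max).
            command.Parameters.Add("@stmt", SqlDbType.NVarChar, -1).Value = wholeScript;
            command.ExecuteNonQuery();
        }
    }
}
```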
Upload the entire script to the server as nvarchar(max) and save it to a file on the SQL Server machine using xp_cmdshell, CLR, or similar. Then, again via xp_cmdshell, run the script file with sqlcmd (see the sketch below).
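A rough sketch of the second idea, assuming xp_cmdshell is enabled on the server, sqlcmd is installed there, and the script file has already been written to the path passed in (by whatever mechanism you choose). The flags and path are illustrative assumptions.

```csharp
// Ask the server to shell out to sqlcmd against a script file that already
// exists on the SQL Server machine.
using System.Data.SqlClient;

static class ServerSideScriptRunner
{
    public static void RunOnServer(SqlConnection connection, string scriptFilePathOnServer)
    {
        // -E uses integrated security; adjust for your environment.
        string commandText =
            "EXEC xp_cmdshell 'sqlcmd -E -i \"" + scriptFilePathOnServer + "\"'";

        using (var command = new SqlCommand(commandText, connection))
        {
            command.ExecuteNonQuery();
        }
    }
}
```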
With any sort of batching you lose some ability to identify exactly where a failure occurred, which matters since you are automating integration tests after all.