I've got a SQL Server 2008 database that's being populated with several hundred thousand records per day.
I'm currently writing some code that's going to make a call to the db, retrieve n records, process them, and write some data back to the db.
There are two ways I see to do this (pseudocode):
function xyz() {
    while (not last record) {
        conn = connection creation code
        conn.open();
        select next 1000 records
        process each record
        conn.close();
    }
}
Essentially creating one connection per batch. The second method:
function xyz() {
    conn = connection creation code
    conn.open();
    while (not last record) {
        select next 1000 records
        process each record
    }
    conn.close();
}
I'm curious what's better practice. I feel like it's the latter, but it's also going to have a more-or-less permanent/persistent connection to my db. I'm worried about possible memory overruns or some such.
Thoughts?
Scott
The ADO.NET SqlClient provider (which I assume you will use, since you say it's C#) automatically does connection pooling; see SQL Server Connection Pooling (ADO.NET). Pooled connections are not truly closed when you call Close; they are simply returned to the pool. 'Opening' and 'closing' pooled connections is therefore very fast.
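To illustrate (a sketch only; the connection string, loop condition and batch logic below are placeholders, not your actual code):

using System.Data.SqlClient;

static void ProcessBatches(string connectionString)
{
    bool lastBatch = false;
    while (!lastBatch)
    {
        // With pooling on (the default), Open() borrows a physical connection
        // from the pool and Close() simply returns it, so opening and closing
        // once per batch costs very little.
        SqlConnection conn = new SqlConnection(connectionString);
        conn.Open();

        // select the next 1000 records and process them here;
        // set lastBatch = true once the final batch has been handled
        lastBatch = true; // placeholder so the sketch terminates

        conn.Close(); // returns the connection to the pool, not a physical close
    }
}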
Unrelated note: you should embed your connection in a using block:
using (SqlConnection conn = new SqlConnection(...))
{
    conn.Open();
    ...
}
This way you avoid leaking connections in exception conditions.
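Putting the two together, the per-batch version of your loop could look roughly like this (again a sketch: the table, column names and the keyset-paging query are made-up placeholders):

using System.Data.SqlClient;

static void ProcessInBatches(string connectionString)
{
    long lastId = 0;   // highest key processed so far (assumes an increasing Id column)
    bool done = false;

    while (!done)
    {
        using (SqlConnection conn = new SqlConnection(connectionString))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT TOP (1000) Id, Payload FROM dbo.IncomingRecords " +
            "WHERE Id > @lastId ORDER BY Id", conn))
        {
            cmd.Parameters.AddWithValue("@lastId", lastId);
            conn.Open();

            int rows = 0;
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    lastId = reader.GetInt64(0);
                    // process each record here
                    rows++;
                }
            }
            done = rows < 1000;   // fewer than a full batch means we're finished
        } // Dispose() closes the connection (back to the pool) even if processing throws
    }
}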