Processing large datasets using LINQ

Every time I write a program of the form below using LINQ to SQL, I end up with a program that just grabs more and more memory as it runs and falls over in a heap, consuming 2 GB after perhaps as few as 25,000 records. I always end up rewriting it using ADO.NET. What am I doing wrong?

Clarification: This question is not about speed of processing; answers about making it go faster are of no relevance.

for (int i = 0; i < some_big_number; i++)
{
    using (myDC dc = new myDC())  // my DataContext
    {
        myRecord record = (from r in dc.myTable where r.Code == i select r).Single();

        // do some LINQ queries using various tables from the data context
        // and the fields from this 'record'. I carefully avoid referencing
        // any data context other than 'dc' in here because I want any cached
        // records to get disposed of when 'dc' gets disposed at the end of
        // each iteration.

        record.someField = newValueJustCalculatedAbove;
        dc.SubmitChanges();
    }
}
asked Nov 10 '09 by Nestor
1 Answer

You are putting pressure on the data context by making it generate the query from scratch on every iteration.

Try using a compiled query instead.
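A minimal sketch of what that could look like, reusing the `myDC` / `myTable` / `Code` names from the question (those types are assumed to exist in your DBML; this isn't runnable without the underlying database). `CompiledQuery.Compile` translates the query expression to SQL once, so each call only substitutes the parameter value instead of rebuilding and retranslating the expression tree:

```csharp
using System;
using System.Data.Linq;
using System.Linq;

// Compile the lookup once; the delegate can be reused across
// data context instances.
static readonly Func<myDC, int, myRecord> recordByCode =
    CompiledQuery.Compile((myDC dc, int code) =>
        (from r in dc.myTable where r.Code == code select r).Single());

// Inside the loop, call the compiled delegate instead of
// writing the query inline:
for (int i = 0; i < some_big_number; i++)
{
    using (myDC dc = new myDC())
    {
        myRecord record = recordByCode(dc, i);
        record.someField = newValueJustCalculatedAbove;
        dc.SubmitChanges();
    }
}
```

Note that the compiled delegate is stored in a static field precisely so it survives the per-iteration disposal of the data context.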

answered Sep 30 '22 by leppie