I'm reading an Excel sheet, applying my business logic, and trying to insert the records using LINQ to SQL. In my loop I have more than 5,000 and fewer than 15,000 records to insert.
```csharp
public List<tblLog> tblLogList = new List<tblLog>();
```

This method is called inside the loop:

```csharp
public void SaveLog()
{
    tblLog tnlog = new tblLog();
    tnlog.id = Guid.NewGuid();
    tnlog.comp_id = Comp_id;
    tnlog.branch_id = Branch_id;
    tnlog.row_id = rowID;
    tnlog.his_id = his_id;

    // Add the record to the list
    tblLogList.Add(tnlog);

    // Earlier I tried submitting one by one:
    //_trContext.tblLogs.InsertOnSubmit(tnlog);
    //_trContext.SubmitChanges();
    // Due to the performance hit, I changed InsertOnSubmit to InsertAllOnSubmit:

    if (tblLogList.Count >= 1000)
    {
        _trContext.tblLogs.InsertAllOnSubmit(tblLogList);
        _trContext.SubmitChanges();
        tblLogList.Clear();
    }
}
```
My question is: what is the maximum number of records I can insert through InsertAllOnSubmit() in LINQ to SQL?

With the code above I can commit up to 1,000 records per batch, but I suspect that once it processes 10,000 or more records it may throw a timeout exception, since it runs inside my **windows service**.

I'm really confused about the best way to handle this logic. Any suggestions? Thanks in advance.
While the others correctly state that there are better solutions for this task, a direct answer to your question has not yet been given.
The answer is that technically the number is either limited by the amount of memory the context will consume by swallowing all these new objects, or by the maximum size of the database transaction log. The former is far more likely to be the bottleneck, but you can prevent reaching both limits by committing the records in chunks of several hundreds or thousands and using a new context (!) for each batch.
As you said, a practical limit can be the command time-out (if you don't want to make it eternal). This too can be circumvented by committing in batches and, thus, starting a new command for each batch.
The functional limit depends on things like whether or not the process is unattended (patience), and whether the data are expected to be available within a certain amount of time. If these requirements are tight you may have to resort to one of the solutions suggested in the other answers.
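To make the batching idea concrete, here is a minimal sketch. `MyDataContext` is a stand-in for whatever DataContext type `_trContext` is, and the batch size of 1,000 is simply the value from the question:

```csharp
using System.Linq;

// Sketch, assuming a DataContext class named MyDataContext and the
// tblLog entity from the question.
const int batchSize = 1000;

foreach (var batch in tblLogList
    .Select((log, i) => new { log, i })
    .GroupBy(x => x.i / batchSize, x => x.log))
{
    // A fresh context per batch keeps the change tracker small and
    // gives each batch its own command (and thus its own time-out window).
    using (var context = new MyDataContext())
    {
        // context.CommandTimeout = 120; // optionally raise the per-command time-out
        context.tblLogs.InsertAllOnSubmit(batch.ToList());
        context.SubmitChanges();
    }
}
```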
At that volume, I'd be using SqlBulkCopy, perhaps via a DataTable, or perhaps via FastMember's ObjectReader API, which creates an IDataReader from a list of typed objects, suitable for feeding to SqlBulkCopy.
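For example, here is a minimal sketch of the FastMember route; the connection string is a placeholder and the column list is an assumption that must match the real tblLog schema:

```csharp
using System.Data.SqlClient;
using FastMember; // NuGet package: FastMember

// Sketch only: "connectionString" is a placeholder, and the member names
// must match both the tblLog properties and the destination columns.
using (var conn = new SqlConnection(connectionString))
{
    conn.Open();
    using (var bcp = new SqlBulkCopy(conn))
    using (var reader = ObjectReader.Create(
        tblLogList, "id", "comp_id", "branch_id", "row_id", "his_id"))
    {
        bcp.DestinationTableName = "tblLog";
        bcp.WriteToServer(reader); // one streaming bulk insert, no per-row commands
    }
}
```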