I have created an application that reads CSV files, creates a DataServiceContext, and mass-inserts the data into my remote OData API server.
However, 5 minutes after running an import of 30,000 records, I found that the application was still using 750 MB of memory!
Is there anything I should do to reduce the memory usage, or at least make it garbage collect earlier? DataServiceContext doesn't seem to implement IDisposable, and my google-fu has failed me. Thanks.
I had a similar issue with the Microsoft.OData.Client.DataServiceContext class, which behaves much like System.Data.Services.Client.DataServiceContext.
What was happening is that DataServiceContext has an EntityTracker that keeps a reference to every entity the context materializes or attaches, so none of them can be garbage collected while the context is alive. After some looking around I saw that DataServiceContext has a MergeOption property. To solve your problem, set it to NoTracking like so:
dsc.MergeOption = MergeOption.NoTracking;
Set it once before starting the enumeration, whether in the constructor or anywhere else before the loop.
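For context, here is a minimal sketch of that setup. The service URI, the Product type, and the "Products" entity set name are illustrative assumptions, not from the original post:

using System;
using Microsoft.OData.Client;

// Hypothetical entity type for illustration only. The property is named
// "ID" so the client's key naming convention picks it up.
public class Product
{
    public int ID { get; set; }
    public string Name { get; set; }
}

public static class NoTrackingExample
{
    public static void Run()
    {
        // Hypothetical service root; substitute your own OData endpoint.
        var dsc = new DataServiceContext(new Uri("https://example.com/odata/"));

        // Disable the EntityTracker before any query executes, so the
        // entities materialized below are never registered with the context.
        dsc.MergeOption = MergeOption.NoTracking;

        foreach (var product in dsc.CreateQuery<Product>("Products"))
        {
            Console.WriteLine(product.Name);
        }

        // dsc.Entities stays empty: each Product becomes collectible
        // as soon as the loop moves past it.
    }
}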
In case it helps anyone, I've been using this extension method after processing to clear the context:
using System.Linq;
using Microsoft.OData.Client;

public static class DataServiceContextExtensions
{
    // Detaches every tracked entity and link so the context releases
    // its references and they become eligible for garbage collection.
    public static void ClearChanges(this DataServiceContext context)
    {
        // ToList() snapshots each collection, since Detach mutates the tracker.
        foreach (var entity in context.Entities.ToList())
        {
            context.Detach(entity.Entity);
        }
        foreach (var link in context.Links.ToList())
        {
            context.DetachLink(link.Source, link.SourceProperty, link.Target);
        }
    }
}
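As a sketch of how that fits into a bulk-insert loop (the context variable, the "Products" entity set name, the rows sequence, and the batch size of 100 are assumptions for illustration):

// Hypothetical bulk-insert loop: save in batches, then detach everything
// so the EntityTracker never accumulates all 30,000 descriptors at once.
int pending = 0;
foreach (var row in rows)
{
    context.AddObject("Products", row);
    if (++pending == 100)
    {
        context.SaveChanges();
        context.ClearChanges(); // drop tracked entities and links
        pending = 0;
    }
}
if (pending > 0)
{
    context.SaveChanges();
    context.ClearChanges();
}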