 

When should I call SaveChanges() when creating 1000's of Entity Framework objects? (like during an import)

I am running an import that will have 1000's of records on each run. Just looking for some confirmation on my assumptions:

Which of these makes the most sense:

  1. Run SaveChanges() every AddToClassName() call.
  2. Run SaveChanges() every n number of AddToClassName() calls.
  3. Run SaveChanges() after all of the AddToClassName() calls.

The first option is probably slow, right? Since it will need to analyze the EF objects in memory, generate SQL, etc., on every call.

I assume that the second option is the best of both worlds, since we can wrap a try/catch around that SaveChanges() call and only lose n records at a time if one of them fails. Maybe store each batch in a List<>. If the SaveChanges() call succeeds, discard the list. If it fails, log the items.

The last option would probably end up being very slow as well, since every single EF object would have to stay in memory until SaveChanges() is called. And if the save failed, nothing would be committed, right?

John B asked Dec 18 '09


2 Answers

I just optimized a very similar problem in my own code and would like to point out an optimization that worked for me.

I found that much of the time spent in SaveChanges, whether processing 100 or 1000 records at once, is CPU bound. So, by processing the contexts with a producer/consumer pattern (implemented with BlockingCollection), I was able to make much better use of CPU cores and went from a total of 4000 changes/second (as reported by the return value of SaveChanges) to over 14,000 changes/second. CPU utilization moved from about 13% (I have 8 cores) to about 60%. Even using multiple consumer threads, I barely taxed the (very fast) disk IO system, and CPU utilization of SQL Server was no higher than 15%.

By offloading the saving to multiple threads, you have the ability to tune both the number of records prior to commit and the number of threads performing the commit operations.

I found that creating 1 producer thread and (# of CPU Cores)-1 consumer threads allowed me to tune the number of records committed per batch such that the count of items in the BlockingCollection fluctuated between 0 and 1 (after a consumer thread took one item). That way, there was just enough work for the consuming threads to work optimally.

This scenario of course requires creating a new context for every batch, which I find to be faster even in a single-threaded scenario for my use case.
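The pattern above could be sketched roughly as follows. This is a hedged illustration, not the answerer's actual code: it assumes the generated EF types from the test code further down (`CamelTrapEntities`, `TestTable`, `AddToTestTable`), and the bounded capacity and batch source are placeholders you would tune for your own import.

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;

public static class BatchImporter
{
    public static void Run(IEnumerable<List<TestTable>> sourceBatches)
    {
        // A small bound keeps the queue hovering between 0 and 1 items,
        // as described above, so consumers always have just enough work.
        var batches = new BlockingCollection<List<TestTable>>(boundedCapacity: 2);

        // (# of CPU cores) - 1 consumer threads; each batch gets a fresh context.
        var consumers = Enumerable.Range(0, Environment.ProcessorCount - 1)
            .Select(_ => Task.Run(() =>
            {
                foreach (var batch in batches.GetConsumingEnumerable())
                {
                    using (var context = new CamelTrapEntities())
                    {
                        foreach (var item in batch)
                            context.AddToTestTable(item);
                        context.SaveChanges(); // one commit per batch
                    }
                }
            }))
            .ToArray();

        // Producer: hand batches to the consumers, then signal completion.
        foreach (var batch in sourceBatches)
            batches.Add(batch);
        batches.CompleteAdding();
        Task.WaitAll(consumers);
    }
}
```

Because each consumer owns its own context, no context is ever shared across threads, which is what makes this safe with EF's non-thread-safe contexts.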

Eric J. answered Oct 22 '22


I would test it first to be sure. Performance doesn't have to be that bad.

If you need to insert all rows in one transaction, call SaveChanges() after all of the AddToClassName() calls. If rows can be entered independently, save changes after every row. Database consistency is important.

I don't like the second option. It would be confusing (from the end user's perspective) if an import into the system declined 10 rows out of 1000 just because 1 was bad. You can try importing 10 at a time and, if the batch fails, retry one by one and then log the failures.

Test whether it actually takes a long time. Don't write 'probably'. You don't know yet. Only when it is actually a problem should you think about another solution (marc_s).

EDIT

EDIT

I've done some tests (times in milliseconds):

10000 rows:

SaveChanges() after 1 row: 18510,534
SaveChanges() after 100 rows: 4350,3075
SaveChanges() after 10000 rows: 5233,0635

50000 rows:

SaveChanges() after 1 row: 78496,929
SaveChanges() after 500 rows: 22302,2835
SaveChanges() after 50000 rows: 24022,8765

So it is actually faster to commit after every n rows than after all of them.

My recommendation is to:

  • Call SaveChanges() after every n rows.
  • If one commit fails, retry the rows one by one to find the faulty row.
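That recovery strategy could be sketched like this. It is an illustration under assumptions, not tested production code: it reuses the generated EF types from the test code below (`CamelTrapEntities`, `TestTable`, `AddToTestTable`), the batch size of 500 is arbitrary, and `LogFailure` is a placeholder for your real logging.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class SafeBatchSaver
{
    private const int BatchSize = 500; // assumption: tune for your data

    public static void Save(IEnumerable<TestTable> rows)
    {
        // Split the input into groups of BatchSize rows.
        foreach (var batch in rows.Select((row, i) => new { row, i })
                                  .GroupBy(x => x.i / BatchSize, x => x.row))
        {
            try
            {
                using (var context = new CamelTrapEntities())
                {
                    foreach (var row in batch)
                        context.AddToTestTable(row);
                    context.SaveChanges(); // one commit per n rows
                }
            }
            catch (Exception)
            {
                // The batch failed: retry one row at a time so only the
                // faulty row(s) are lost, and log each failure.
                foreach (var row in batch)
                {
                    try
                    {
                        using (var context = new CamelTrapEntities())
                        {
                            context.AddToTestTable(row);
                            context.SaveChanges();
                        }
                    }
                    catch (Exception ex)
                    {
                        LogFailure(row, ex);
                    }
                }
            }
        }
    }

    private static void LogFailure(TestTable row, Exception ex)
    {
        // Placeholder: write to your real log sink here.
        Console.Error.WriteLine("Row failed: " + ex.Message);
    }
}
```

Opening a fresh context per batch (and per retried row) keeps the change tracker small and ensures one bad entity can't poison later commits.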

Test classes:

TABLE:

CREATE TABLE [dbo].[TestTable](
    [ID] [int] IDENTITY(1,1) NOT NULL,
    [SomeInt] [int] NOT NULL,
    [SomeVarchar] [varchar](100) NOT NULL,
    [SomeOtherVarchar] [varchar](50) NOT NULL,
    [SomeOtherInt] [int] NULL,
 CONSTRAINT [PkTestTable] PRIMARY KEY CLUSTERED
(
    [ID] ASC
)WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]

Class:

public class TestController : Controller
{
    //
    // GET: /Test/
    private readonly Random _rng = new Random();
    private const string _chars = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";

    private string RandomString(int size)
    {
        var randomSize = _rng.Next(size);
        char[] buffer = new char[randomSize];

        for (int i = 0; i < randomSize; i++)
        {
            buffer[i] = _chars[_rng.Next(_chars.Length)];
        }
        return new string(buffer);
    }

    public ActionResult EFPerformance()
    {
        string result = "";

        TruncateTable();
        result = result + "SaveChanges() after 1 row:" + EFPerformanceTest(10000, 1).TotalMilliseconds + "<br/>";
        TruncateTable();
        result = result + "SaveChanges() after 100 rows:" + EFPerformanceTest(10000, 100).TotalMilliseconds + "<br/>";
        TruncateTable();
        result = result + "SaveChanges() after 10000 rows:" + EFPerformanceTest(10000, 10000).TotalMilliseconds + "<br/>";
        TruncateTable();
        result = result + "SaveChanges() after 1 row:" + EFPerformanceTest(50000, 1).TotalMilliseconds + "<br/>";
        TruncateTable();
        result = result + "SaveChanges() after 500 rows:" + EFPerformanceTest(50000, 500).TotalMilliseconds + "<br/>";
        TruncateTable();
        result = result + "SaveChanges() after 50000 rows:" + EFPerformanceTest(50000, 50000).TotalMilliseconds + "<br/>";
        TruncateTable();

        return Content(result);
    }

    private void TruncateTable()
    {
        using (var context = new CamelTrapEntities())
        {
            var connection = ((EntityConnection)context.Connection).StoreConnection;
            connection.Open();
            var command = connection.CreateCommand();
            command.CommandText = @"TRUNCATE TABLE TestTable";
            command.ExecuteNonQuery();
        }
    }

    private TimeSpan EFPerformanceTest(int noOfRows, int commitAfterRows)
    {
        var startDate = DateTime.Now;

        using (var context = new CamelTrapEntities())
        {
            for (int i = 1; i <= noOfRows; ++i)
            {
                var testItem = new TestTable();
                testItem.SomeVarchar = RandomString(100);
                testItem.SomeOtherVarchar = RandomString(50);
                testItem.SomeInt = _rng.Next(10000);
                testItem.SomeOtherInt = _rng.Next(200000);
                context.AddToTestTable(testItem);

                if (i % commitAfterRows == 0) context.SaveChanges();
            }
        }

        var endDate = DateTime.Now;

        return endDate.Subtract(startDate);
    }
}
LukLed answered Oct 22 '22