I'm looking for the fastest way of inserting into Entity Framework.
I'm asking this because of the scenario where you have an active TransactionScope
and the insertion is huge (4000+ records). The insert can potentially take more than 10 minutes (the default transaction timeout), and this will lead to an incomplete transaction.
You can add multiple records or objects using the AddRange method of DbSet, as shown in the following code. The code creates a list of department objects, adds two new departments to the list, and then adds the whole list to the context with AddRange.
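A minimal sketch of that example, assuming a code-first model with a Department entity and a SchoolContext (both names are illustrative, not from the original text; EF6 shown, EF Core is analogous):

using System.Collections.Generic;
using System.Data.Entity; // EF6; for EF Core use Microsoft.EntityFrameworkCore

// Illustrative code-first model for this sketch.
public class Department
{
    public int DepartmentId { get; set; }
    public string Name { get; set; }
}

public class SchoolContext : DbContext
{
    public DbSet<Department> Departments { get; set; }
}

public static class AddRangeExample
{
    public static void InsertDepartments()
    {
        var departments = new List<Department>
        {
            new Department { Name = "Mathematics" },
            new Department { Name = "Physics" }
        };

        using (var context = new SchoolContext())
        {
            // AddRange marks every entity in the list as Added in one call.
            context.Departments.AddRange(departments);
            context.SaveChanges();
        }
    }
}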
Entity Framework is very slow the first time it is used because it compiles the model on the first query. If you are using EF 6.2, you can use the Model Cache, which loads a prebuilt EDMX for a code-first model instead of having EF generate it at startup.
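A sketch of enabling that cache through a DbConfiguration class, assuming EF 6.2's DefaultDbModelStore; the cache directory here is an arbitrary choice:

using System.Data.Entity;
using System.Data.Entity.Infrastructure;
using System.IO;

// EF picks up this configuration automatically when it lives in the same
// assembly as the DbContext. The cache directory below is just an example.
public class ModelCacheConfiguration : DbConfiguration
{
    public ModelCacheConfiguration()
    {
        // Loads the cached .edmx from the directory if it exists; otherwise
        // builds the model once and writes the file for the next startup.
        SetModelStore(new DefaultDbModelStore(Directory.GetCurrentDirectory()));
    }
}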
EF Core 6.0 itself is 31% faster executing queries.
Dapper is much faster than Entity Framework Core, largely because there are no bells and whistles in Dapper: it is a straightforward micro-ORM with a minimal feature set. It is up to the developer to choose between these two data access technologies.
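For comparison, a minimal Dapper insert could look like this; the connection string, the People table, and the Person type are assumptions for illustration. When the parameter is a collection, Execute runs the statement once per element:

using System.Collections.Generic;
using System.Data.SqlClient;
using Dapper;

public class Person
{
    public string Name { get; set; }
}

public static class DapperInsertExample
{
    // Placeholder connection string.
    private const string ConnectionString = "Server=.;Database=Demo;Trusted_Connection=True;";

    public static void InsertPeople(IEnumerable<Person> people)
    {
        using (var connection = new SqlConnection(ConnectionString))
        {
            // Dapper maps @Name to Person.Name and executes the INSERT
            // once for each element of the collection.
            connection.Execute("INSERT INTO People (Name) VALUES (@Name)", people);
        }
    }
}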
To your remark in the comments to your question:
"...SavingChanges (for each record)..."
That's the worst thing you can do! Calling SaveChanges() for each record slows bulk inserts down dramatically. I would try a few simple tests which will very likely improve the performance:
- Call SaveChanges() once after ALL records.
- Call SaveChanges() after, for example, 100 records.
- Call SaveChanges() after, for example, 100 records, then dispose the context and create a new one.
For bulk inserts I am working and experimenting with a pattern like this:
using (TransactionScope scope = new TransactionScope())
{
    MyDbContext context = null;
    try
    {
        context = new MyDbContext();
        context.Configuration.AutoDetectChangesEnabled = false;

        int count = 0;
        foreach (var entityToInsert in someCollectionOfEntitiesToInsert)
        {
            ++count;
            context = AddToContext(context, entityToInsert, count, 100, true);
        }

        context.SaveChanges();
    }
    finally
    {
        if (context != null)
            context.Dispose();
    }

    scope.Complete();
}

private MyDbContext AddToContext(MyDbContext context,
    Entity entity, int count, int commitCount, bool recreateContext)
{
    context.Set<Entity>().Add(entity);

    if (count % commitCount == 0)
    {
        context.SaveChanges();
        if (recreateContext)
        {
            context.Dispose();
            context = new MyDbContext();
            context.Configuration.AutoDetectChangesEnabled = false;
        }
    }

    return context;
}
I have a test program which inserts 560,000 entities (9 scalar properties, no navigation properties) into the DB. With this code it finishes in less than 3 minutes.
For performance it is important to call SaveChanges() after "many" records ("many" meaning around 100 or 1,000). It also improves performance to dispose the context after SaveChanges and create a new one. This clears the context of all entities; SaveChanges doesn't do that, the entities are still attached to the context in state Unchanged. It is the growing number of attached entities in the context that slows down the insertion step by step. So it is helpful to clear the context after a while.
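If you are on EF Core 5.0 or later, you can get a similar effect without recreating the context by clearing the change tracker. A small sketch, with illustrative Item and BulkContext names and a placeholder connection string:

using System.Collections.Generic;
using Microsoft.EntityFrameworkCore;

// Illustrative model; not from the original answer.
public class Item
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class BulkContext : DbContext
{
    public DbSet<Item> Items { get; set; }

    protected override void OnConfiguring(DbContextOptionsBuilder options)
        => options.UseSqlServer("Server=.;Database=Demo;Trusted_Connection=True;"); // placeholder
}

public static class BulkInsertExample
{
    public static void Run(IEnumerable<Item> itemsToInsert)
    {
        using var context = new BulkContext();
        context.ChangeTracker.AutoDetectChangesEnabled = false;

        var count = 0;
        foreach (var item in itemsToInsert)
        {
            context.Items.Add(item);
            if (++count % 1000 == 0)
            {
                context.SaveChanges();
                context.ChangeTracker.Clear(); // EF Core 5.0+: detaches every tracked entity
            }
        }

        context.SaveChanges();
    }
}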
From my measurements with the 560,000 entities: committing after every single record gives very non-linear performance that degrades extremely over time ("many hours" is an estimate; I never finished that test and stopped at 50,000 entities after 20 minutes). This non-linear behaviour is not nearly as significant in any of the other tests.
This combination increases speed well enough:
context.Configuration.AutoDetectChangesEnabled = false;
context.Configuration.ValidateOnSaveEnabled = false;
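A small sketch of applying both settings only for the bulk operation and restoring them afterwards, reusing the MyDbContext and Entity names from the code above:

private void BulkInsert(MyDbContext context, IEnumerable<Entity> entitiesToInsert)
{
    context.Configuration.AutoDetectChangesEnabled = false;
    context.Configuration.ValidateOnSaveEnabled = false;
    try
    {
        // EF6's AddRange adds the whole collection in one call.
        context.Set<Entity>().AddRange(entitiesToInsert);
        context.SaveChanges();
    }
    finally
    {
        // Restore the defaults so the context behaves normally afterwards.
        context.Configuration.AutoDetectChangesEnabled = true;
        context.Configuration.ValidateOnSaveEnabled = true;
    }
}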