I'm trying to insert a large collection of objects (around 20,000 items) into a table using Fluent NHibernate, calling SaveOrUpdate for each entity of the collection in a foreach loop. I had been setting the batch size to the size of the collection, but having read on Ayende's blog that very large values aren't recommended, I've capped it at 250. However, when I view the session in NHProf I see a stream of individual insert statements rather than a set of batched insert calls.
This seems very inefficient and is taking far longer than I'd expect for what is, in essence, a very basic query: inserting values into 25 columns (yes, that's one place this could be improved, but it's a legacy database I'm stuck with for now) of a SQL Server 2008 database. I can only assume I'm doing it wrong.
Is there a recommended way to insert large collections of entities using NHibernate? Is there an efficiency gain to be had by using Save over SaveOrUpdate?
Code of the Add method - the inner SetBatchSize call is a helper that caps the batch size at 250, or returns the collection size if it's less than 250:
public void Add(IEnumerable<TEntity> entities)
{
    var session = GetCurrentSession();
    using (var transaction = session.BeginTransaction())
    {
        // Materialise the sequence so it's only enumerated once
        entities = entities.ToList();

        // The inner SetBatchSize is a private helper: it returns the collection
        // size, capped at 250, which is then applied to the session
        session.SetBatchSize(SetBatchSize(entities.Count()));

        foreach (var entity in entities)
            session.SaveOrUpdate(entity);

        transaction.Commit();
    }
}
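For reference, the session factory is built with Fluent NHibernate along these lines; the connection string and mapping type below are simplified placeholders rather than the real values:

using FluentNHibernate.Cfg;
using FluentNHibernate.Cfg.Db;

// ...

var sessionFactory = Fluently.Configure()
    .Database(MsSqlConfiguration.MsSql2008
        .ConnectionString("Server=.;Database=Legacy;Trusted_Connection=True;")
        .AdoNetBatchSize(250)) // enables ADO.NET batching (adonet.batch_size)
    .Mappings(m => m.FluentMappings.AddFromAssemblyOf<EntityMap>()) // EntityMap is a placeholder mapping class
    .BuildSessionFactory();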
Apologies for the slightly vague question; I get the feeling I'm just approaching things the wrong way, and any pointers would be greatly appreciated!
You'll want to use a StatelessSession for bulk inserts.
(related: Inserts of stateless session of NHibernate are slow)
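A minimal sketch of what that could look like for the Add scenario above; GetSessionFactory() here stands in for however you get hold of your ISessionFactory, and the batch size of 250 is simply carried over from the question:

public void BulkInsert(IEnumerable<TEntity> entities)
{
    // A stateless session has no first-level cache, no dirty tracking and no
    // cascades, so it avoids most of the per-entity overhead of a regular ISession.
    using (var session = GetSessionFactory().OpenStatelessSession())
    using (var transaction = session.BeginTransaction())
    {
        session.SetBatchSize(250); // batching also requires adonet.batch_size and a non-identity id generator

        foreach (var entity in entities)
            session.Insert(entity); // stateless sessions use Insert rather than SaveOrUpdate

        transaction.Commit();
    }
}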