We recently built a mapping class, called TableMapper&lt;T&gt;, which encapsulates our Dapper multimap function. We build this separately from our 'command' object, called TableCommand, which holds the SQL text and related information. In order to use it, however, we have to go through QueryMultiple, even when we are returning a single result set and then mapping it.
We've run basic performance metrics, and the performance appears to be equal to that of the regular Query API (looping over the same query with the same multimap, but using QueryMultiple with Read()).
So the question is: is there a fundamental disadvantage, in either performance or behavior, to using QueryMultiple for a single recordset? It appears there is not, but we thought the community at large might have greater insight.
Sam Saffron indicates in this post (Dapper.NET and stored proc with multiple result sets) that the results are not buffered, but that was from a while ago, and the current source code looks like 'buffered' now defaults to true (public IEnumerable Read(bool buffered = true)).
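For reference, the single-grid multimap read via QueryMultiple looks roughly like this (a sketch; the connection, SQL, parameters, and the "ProductId" split column are placeholders, not our actual schema):

```csharp
using (var grid = connection.QueryMultiple(sql, parameters))
{
    // GridReader.Read<...> buffers the grid by default (buffered: true),
    // so the rows are materialized before the grid is disposed.
    IEnumerable<OrderLineItem> items = grid.Read<OrderLineItem, Product, OrderLineItem>(
        (oli, p) => { oli.Product = p; return oli; },
        splitOn: "ProductId");
}
```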
The usage (below) is very clean and allows us to handle the connection and error handling in one spot, namely our 'Query' extension method on an IDatabase object.
var command = new TableCommand(&lt;SQL&gt;, &lt;Parameters&gt;, &lt;Timeout&gt;);
var mapper = new TableMapper&lt;OrderLineItem&gt;(); // return type here
mapper.SetMap&lt;OrderLineItem, Product&gt;((oli, p) =&gt; { oli.Product = p; return oli; }); // input types here
return this.Database.Query(command, mapper); // returns IEnumerable&lt;OrderLineItem&gt;
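For context, the extension method is shaped roughly like this (a sketch only; OpenConnection, Sql, Parameters, Timeout, and mapper.Read are illustrative member names, not our exact API):

```csharp
public static IEnumerable<T> Query<T>(this IDatabase db, TableCommand command, TableMapper<T> mapper)
{
    using (var connection = db.OpenConnection())   // illustrative member
    using (var grid = connection.QueryMultiple(
        command.Sql,                               // illustrative member
        command.Parameters,                        // illustrative member
        commandTimeout: command.Timeout))          // illustrative member
    {
        // The mapper wraps the multimap Read<...> call. Because Read buffers
        // by default, the result is safe to return after the grid is disposed.
        return mapper.Read(grid);                  // illustrative member
    }
}
```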
The issue here is not performance, but convenience. Most queries, IMO, are single result grid. It is more convenient not to have to mess with the additional complexity of the multiple grid reader scenario, which is necessarily more complex because each grid can be a different shape, and they can only be read in a particular order.
Regarding buffering: Read&lt;T&gt; / Query&lt;T&gt; buffer that grid by default, although it can be turned off for fully streaming results.
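To illustrate the buffering switch (a sketch; sql and connection are placeholders), both the plain Query API and the grid reader accept a buffered flag:

```csharp
// Streams rows as they are read, instead of materializing a list first.
var streamed = connection.Query<OrderLineItem>(sql, buffered: false);

using (var grid = connection.QueryMultiple(sql))
{
    var alsoStreamed = grid.Read<OrderLineItem>(buffered: false);
    // With buffered: false, consume the rows before the grid is disposed.
    foreach (var row in alsoStreamed) { /* ... */ }
}
```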