
Transaction deadlocks, how to design properly?

So I'm working on an Entity Framework project that will be used as a kind of DAL, and when I run stress tests (kicking off several entity updates in parallel through Thread instances) I get these:

_innerException = {"Transaction (Process ID 94) was deadlocked on lock resources with another process and has been chosen as the deadlock victim. Rerun the transaction."}

Here's some example of how I implemented my classes' methods:

public class OrderController
{

    public Order Select(long orderID)
    {
        using (var ctx = new BackEndEntities())
        {

            try
            {
                var res = from n in ctx.Orders
                                       .Include("OrderedServices.Professional")
                                       .Include("Agency")
                                       .Include("Agent")
                          where n.OrderID == orderID
                          select n;
                return res.FirstOrDefault();
            }
            catch (Exception)
            {
                throw; // rethrow without resetting the stack trace
            }
         }
    }

    public bool Update(Order order)
    {
        using (var ctx = new BackEndEntities())
        {
            try
            {
                order.ModificationDate = DateTime.Now;
                ctx.Orders.Attach(order);
                // mark the attached entity as modified so SaveChanges actually issues an UPDATE
                ctx.ObjectStateManager.ChangeObjectState(order, System.Data.EntityState.Modified);
                ctx.SaveChanges();
                return true;
            }
            catch (Exception)
            {
                throw; // rethrow without resetting the stack trace
            }
        }
    }
}

and:

public class AgentController
{

    public Agent Select(long agentID)
    {
        using (var ctx = new BackEndEntities())
        {
            try
            {
                var res = from n in ctx.Agents.Include("Orders")
                          where n.AgentID == agentID
                          select n;
                return res.FirstOrDefault();
            }
                catch (Exception)
                {
                    throw; // rethrow without resetting the stack trace
                }
        }

    }

    public bool Update(Agent agent)
    {
        using (var ctx = new BackEndEntities())
        {
            try
            {
                agent.ModificationDate = DateTime.Now;
                ctx.Agents.Attach(agent);
                ctx.ObjectStateManager.ChangeObjectState(agent, System.Data.EntityState.Modified);
                ctx.SaveChanges();
                return true;
            }
            catch (Exception)
            {
                throw; // rethrow without resetting the stack trace
            }
        }
    }
}

Obviously, the code here could probably be better, but I'm rather an EF newbie. I think my problem is really a design problem with the context.

I remember someone here mentioning that if my context is NOT shared, I won't run into these deadlock issues.

This does not seem 'shared' to me, since I do a using (new BackEndEntities()) in each method, so what do I have to change to make it more robust?

This DAL will be used in a web service exposed on the internet (after code review, of course), so I have no control over how much it will be stressed, and lots of different instances might want to update the same entity.

Thanks!

— asked Aug 28 '12 by Francis Ducharme



3 Answers

The reason for those deadlocks isn't your code itself; it's that EF uses SERIALIZABLE as the default TransactionScope isolation level.

SERIALIZABLE is the most restrictive isolation level possible. This means that by default you are opting into the heaviest locking, and you can expect a lot of blocking and deadlocks.

The solution is to specify a different isolation level depending on the action you want to perform. You can wrap your EF actions like this:

// requires a reference to System.Transactions
using (var scope = new TransactionScope(TransactionScopeOption.Required,
        new TransactionOptions { IsolationLevel = IsolationLevel.Snapshot }))
{
    // do something with EF here
    scope.Complete();
}
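Note that the Snapshot level must first be enabled on the SQL Server database itself, otherwise the scope above fails at runtime. A hedged sketch of the one-time setup; the database name here is a placeholder, not from the question:

```sql
-- Placeholder database name; run once per database.
ALTER DATABASE BackEnd SET ALLOW_SNAPSHOT_ISOLATION ON;

-- Optional: make plain READ COMMITTED reads use row versioning as well,
-- so readers stop blocking writers even outside explicit Snapshot scopes.
ALTER DATABASE BackEnd SET READ_COMMITTED_SNAPSHOT ON;
```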

Read more on this issue:

http://blogs.msdn.com/b/diego/archive/2012/04/01/tips-to-avoid-deadlocks-in-entity-framework-applications.aspx

http://blogs.u2u.be/diederik/post/2010/06/29/Transactions-and-Connections-in-Entity-Framework-40.aspx

http://blog.aggregatedintelligence.com/2012/04/sql-server-transaction-isolation-and.html

https://serverfault.com/questions/319373/sql-deadlocking-and-timing-out-almost-constantly

— answered Nov 04 '22 by Henrik Stenbæk


Deadlock freedom is a pretty hard problem in a big system. It has nothing to do with EF by itself.

Shortening the lifetime of your transactions reduces deadlocks, but it introduces data inconsistencies. In the places where you were deadlocking previously, you are now silently losing updates (last writer wins, with no notification).

So choose your context lifetime and your transaction lifetime according to the logical transaction, not according to physical considerations.

Turn on snapshot isolation. This takes reading transactions totally out of the equation.

For writing transactions you need to find a lock ordering. Often the easiest way is to lock pessimistically at a higher level. Example: are you always modifying data in the context of a customer? Then take an update lock on that customer as the first statement of your transaction. That provides total deadlock freedom by serializing access to that customer.
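For the pessimistic, higher-level lock described above, one option with an EF 4 ObjectContext is to issue a raw locking SELECT as the first statement of the transaction. This is only a hedged sketch: the Customers table, CustomerID column, and customerId variable are illustrative assumptions, not from the question.

```csharp
var options = new TransactionOptions { IsolationLevel = IsolationLevel.ReadCommitted };
using (var scope = new TransactionScope(TransactionScopeOption.Required, options))
using (var ctx = new BackEndEntities())
{
    // UPDLOCK + HOLDLOCK holds an update lock on the customer row until the
    // transaction ends, so concurrent writers for the same customer queue up
    // here instead of deadlocking further down.
    ctx.ExecuteStoreCommand(
        "SELECT 1 FROM Customers WITH (UPDLOCK, HOLDLOCK) WHERE CustomerID = {0}",
        customerId);

    // ... perform all modifications for this customer, then:
    ctx.SaveChanges();
    scope.Complete();
}
```

Because every writer touching that customer takes the same lock first, lock acquisition is totally ordered per customer and a deadlock cycle cannot form among them.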

— answered Nov 04 '22 by usr


The context is what gives Entity Framework its ability to talk to the database; without a context there's no concept of what goes where. Spinning up a context, therefore, is kind of a big deal, and it occupies a lot of resources, including external resources like the database. I believe your problem IS the 'new' command, since you have multiple threads attempting to spin up and grab the same database resource, which definitely can deadlock.

Your code as you've posted it seems to be an anti-pattern. The way it looks, your Entity context spins up and goes out of scope relatively quickly, while your repository CRUD objects seem to persist for much longer.

The way the companies I have implemented Entity for have traditionally done it is exactly the opposite: the context is created and kept for as long as the assembly needs the database, and the repository CRUD objects are created and die in microseconds.

I cannot say where you got the assertion that the context should not be shared, so I don't know under what circumstances that was said, but it is absolutely true that you should not share the context across assemblies. Within the same assembly, I cannot see any reason why you wouldn't, given how many resources it takes to start up a context and how long that takes. The Entity context is quite heavy, and if you made your current code work by going single-threaded I suspect you would see some absolutely atrocious performance.

So what I would recommend instead is to refactor this so your methods take the context as a parameter, e.g. Select(BackEndEntities context, ...) and Update(BackEndEntities context, ...), and then have your master thread (the one making all these child threads) create and maintain a BackEndEntities context to pass along to its children. Also be sure that you get rid of your AgentControllers and OrderControllers the instant you're done with them, and never, ever reuse them outside of a method. Implementing a good inversion-of-control framework like Ninject or StructureMap can make this a lot easier.
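A minimal sketch of the shape this answer suggests, using the method names from the question. One caveat worth labeling explicitly: an ObjectContext instance is not thread-safe, so the owning thread must not hand the same context to several children concurrently.

```csharp
public class OrderController
{
    // The controller no longer owns the context; the caller passes one in.
    public Order Select(BackEndEntities ctx, long orderID)
    {
        return (from n in ctx.Orders
                              .Include("OrderedServices.Professional")
                              .Include("Agency")
                              .Include("Agent")
                where n.OrderID == orderID
                select n).FirstOrDefault();
    }

    public bool Update(BackEndEntities ctx, Order order)
    {
        order.ModificationDate = DateTime.Now;
        ctx.Orders.Attach(order);
        ctx.ObjectStateManager.ChangeObjectState(order, System.Data.EntityState.Modified);
        ctx.SaveChanges();
        return true;
    }
}

public class MasterThread
{
    public void Run(long orderID)
    {
        // The long-lived context is created once here and passed down;
        // the controllers stay short-lived, as recommended above.
        using (var ctx = new BackEndEntities())
        {
            var controller = new OrderController();
            var order = controller.Select(ctx, orderID);
            // ... mutate order ...
            controller.Update(ctx, order);
        }
    }
}
```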

— answered Nov 04 '22 by tmesser