I have a couple-hundred-line stored procedure that takes a single parameter (@id) and is heavily simplified to something like:
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

INSERT INTO #new_result
EXEC pr_do_a_ton_of_calculations;

DELETE FROM result WHERE id = @id;

INSERT INTO result
SELECT * FROM #new_result;
Multiple processes may invoke this procedure concurrently with the same parameters. What I'm seeing is that both executions delete the rows one after the other, and then both try to insert the same data. One of them errors out, because it is inserting duplicate rows and violating a unique constraint.
Ideally, I'd like to ensure that two connections executing the procedure with the same @id parameter run the DELETE and INSERT serially, without locking the entire table. It's also fine if the two invocations are completely serialized, as long as they don't block other invocations with a different parameter.
Is there any way I can achieve this?
Add this to the beginning of your stored procedure:
DECLARE @lid INT;

SELECT @lid = id
FROM result WITH (UPDLOCK, ROWLOCK)
WHERE id = @id;
and get rid of the READ UNCOMMITTED above.
Make sure your id column is indexed. If it's a reference to another table where it is a PRIMARY KEY, take the lock on that table instead.
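Putting the pieces together, the whole procedure might look roughly like this (a sketch only: the procedure name is illustrative, and it assumes result.id is indexed and a row for @id already exists to lock):

```sql
CREATE PROCEDURE pr_refresh_result @id INT
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRANSACTION;

    -- Take an update lock on the row(s) for this @id. A second caller with
    -- the same @id blocks here until we commit; callers with a different
    -- @id lock different rows and proceed concurrently.
    DECLARE @lid INT;
    SELECT @lid = id
    FROM result WITH (UPDLOCK, ROWLOCK)
    WHERE id = @id;

    INSERT INTO #new_result
    EXEC pr_do_a_ton_of_calculations;

    DELETE FROM result WHERE id = @id;

    INSERT INTO result
    SELECT * FROM #new_result;

    COMMIT TRANSACTION;
END
```

One caveat: if no row for @id exists yet, there is nothing to lock and two concurrent callers can still race; in that case an application lock (sp_getapplock, below) is the safer option.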
Better yet, use application locks (sp_getapplock).
You can use application locks, for example:

DECLARE @ResourceName NVARCHAR(255) = 'MyResource' + CONVERT(NVARCHAR(20), @id);

-- The default lock owner is Transaction, so this must run inside an open
-- transaction (alternatively, pass @LockOwner = 'Session').
EXEC sp_getapplock @Resource = @ResourceName, @LockMode = 'Exclusive';

---- Do your thing ----

EXEC sp_releaseapplock @Resource = @ResourceName;

Note that sp_releaseapplock takes no @LockMode parameter, and @ResourceName only needs to be declared once per batch.