What are the risks or performance degradation when using SET TRANSACTION ISOLATION LEVEL REPEATABLE READ?
We have some stored procedures that are called from BizTalk services, and we are sometimes getting deadlocks. If we change the isolation level to REPEATABLE READ, what are the risks and possible performance degradations?
Repeatable Read holds shared locks on all rows that have been read until the transaction completes. In situations where you are working with cursors that fetch large amounts of data, this can cause contention with other users, because they cannot obtain locks to update any of the rows read under Repeatable Read until your transaction ends.
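As an illustration, here is a minimal two-session sketch of that blocking behavior (the table dbo.Orders and its columns are hypothetical):

    -- Session 1: read under REPEATABLE READ and keep the transaction open
    SET TRANSACTION ISOLATION LEVEL REPEATABLE READ;
    BEGIN TRANSACTION;
    SELECT OrderId, Status
    FROM dbo.Orders              -- hypothetical table
    WHERE CustomerId = 42;       -- shared (S) locks stay held on these rows
    -- ... transaction still open ...

    -- Session 2: this UPDATE blocks until Session 1 commits or rolls back
    UPDATE dbo.Orders
    SET Status = 'Shipped'
    WHERE CustomerId = 42;

    -- Session 1: ending the transaction releases the locks and unblocks Session 2
    COMMIT TRANSACTION;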
The main performance risk is that transactions may suffer an increased number of lock timeouts and/or deadlocks. This risk is proportional to the probability that two transactions need to read/update the same rows at the same time. Another factor that can affect your application is lock granularity: if locks are taken at the page level, contention can occur whenever the data different transactions need to access lies on the same page, not necessarily in the same row.
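You can inspect which locks your transaction is actually holding, and at what granularity, by querying sys.dm_tran_locks from inside the open transaction. A sketch, again using the hypothetical dbo.Orders table:

    SET TRANSACTION ISOLATION LEVEL REPEATABLE READ;
    BEGIN TRANSACTION;

    SELECT OrderId FROM dbo.Orders WHERE CustomerId = 42;

    -- List the locks held by this session: resource_type values such as
    -- KEY, PAGE, or OBJECT show whether SQL Server locked rows, pages,
    -- or escalated to the whole table.
    SELECT resource_type, request_mode, request_status, COUNT(*) AS lock_count
    FROM sys.dm_tran_locks
    WHERE request_session_id = @@SPID
    GROUP BY resource_type, request_mode, request_status;

    ROLLBACK TRANSACTION;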
On the other hand, when you use a lower isolation level, such as Read Committed (Cursor Stability in DB2 terms), you leave open the possibility that rows you fetched earlier in your transaction may be updated by other transactions before your unit of work has completed.
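For contrast, here is a sketch of the non-repeatable read that the lower level permits (READ COMMITTED, SQL Server's default; table and values hypothetical):

    SET TRANSACTION ISOLATION LEVEL READ COMMITTED;
    BEGIN TRANSACTION;

    SELECT Status FROM dbo.Orders WHERE OrderId = 1;   -- returns 'Pending'

    -- Meanwhile another session commits:
    --   UPDATE dbo.Orders SET Status = 'Shipped' WHERE OrderId = 1;

    SELECT Status FROM dbo.Orders WHERE OrderId = 1;   -- now returns 'Shipped':
                                                       -- a non-repeatable read
    COMMIT TRANSACTION;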
Try it. There is no way we can tell you what potential risks or performance issues you might run into from a single data point (the transaction isolation level). We know nothing else about your data, data volume, transactions per second, or data dependencies. Whenever performance is the question: try it and measure it. There is no other answer.
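Since deadlocks are the symptom you are measuring, one place to start is the deadlock graphs SQL Server already records in the built-in system_health Extended Events session (written to .xel files on SQL Server 2012 and later); a sketch:

    -- Retrieve recent deadlock graphs captured by the default
    -- system_health Extended Events session.
    SELECT CAST(event_data AS XML) AS deadlock_graph
    FROM sys.fn_xe_file_target_read_file('system_health*.xel', NULL, NULL, NULL)
    WHERE object_name = 'xml_deadlock_report';

Comparing the number and shape of these graphs before and after the isolation-level change gives you a concrete measurement rather than a guess.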