
TRANSACTION ISOLATION LEVEL REPEATABLE READ in SQL Server

What are the risks or performance degradations when using SET TRANSACTION ISOLATION LEVEL REPEATABLE READ?

We have some stored procedures that are called from BizTalk services, and we sometimes get deadlocks. If we change the isolation level, what are the risks and possible performance degradations?

asked Dec 29 '22 by Mariano

2 Answers

Repeatable Read puts locks on all rows that have been fetched. When you are working with cursors that fetch large amounts of data, this can cause contention with other users, because they cannot obtain the locks needed to update any of the rows read under Repeatable Read until the cursor is closed.
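A minimal sketch of that contention, using a hypothetical Orders table (object and column names are assumptions, not from the question):

```sql
-- Session 1: under REPEATABLE READ, shared locks taken by the read are
-- held until the transaction ends, not released after each row is read.
SET TRANSACTION ISOLATION LEVEL REPEATABLE READ;
BEGIN TRANSACTION;

SELECT OrderId, Status
FROM dbo.Orders
WHERE CustomerId = 42;          -- S locks on these rows are kept

-- ... long-running work, or an open cursor fetching more rows ...

COMMIT TRANSACTION;             -- only now can other sessions update those rows

-- Session 2 (run while Session 1 is still open): this UPDATE blocks
-- until Session 1 commits or rolls back.
UPDATE dbo.Orders
SET Status = 'Shipped'
WHERE CustomerId = 42;
```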

The risk of performance degradation is that transactions may suffer an increased number of timeouts and/or deadlocks. This risk is proportional to the probability that two transactions need to read or update the same rows at the same time. Another factor that can impact your application is the granularity of the locks taken: if locks are taken at the page level, contention may occur whenever the data that different transactions need to access lies on the same page, not necessarily in the same row.
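One way to check whether your transaction is holding row-, page-, or table-level locks is to query sys.dm_tran_locks from another connection while the transaction is open (a diagnostic sketch; the session id 52 is a placeholder for your own session):

```sql
-- Inspect the locks held by the REPEATABLE READ session above.
SELECT resource_type,        -- KEY = row-level, PAGE = page-level, OBJECT = table-level
       request_mode,         -- S, U, X, ...
       request_status,       -- GRANT or WAIT
       request_session_id
FROM sys.dm_tran_locks
WHERE request_session_id = 52;   -- replace with the actual session id
```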

On the other hand, when you use a lower isolation level (cursor stability, for example), you leave open the possibility that rows you previously fetched during your transaction may be updated by other transactions before your unit of work has completed.
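For contrast, a sketch of the non-repeatable read that a lower isolation level permits, again using the hypothetical Orders table:

```sql
-- Session 1, READ COMMITTED: shared locks are released as soon as each row is read.
SET TRANSACTION ISOLATION LEVEL READ COMMITTED;
BEGIN TRANSACTION;

SELECT Status FROM dbo.Orders WHERE OrderId = 1001;   -- returns 'Pending'

-- Meanwhile, Session 2 commits:
--   UPDATE dbo.Orders SET Status = 'Shipped' WHERE OrderId = 1001;

SELECT Status FROM dbo.Orders WHERE OrderId = 1001;   -- now returns 'Shipped'
COMMIT TRANSACTION;   -- under REPEATABLE READ both reads would have matched
```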

answered Jan 14 '23 by NealB

Try it. There is no way we can tell you what potential risks or performance issues you might run into from a single data point (the transaction isolation level). We know nothing else about your data, data volumes, transactions per second, or data dependencies. Whenever performance is the question, try it and measure it. There is no other answer.
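A hedged sketch of the kind of measurement this implies: snapshot lock-related waits, run a representative workload under each isolation level, and compare (the procedure name and parameter are placeholders for your own stored procedure):

```sql
-- Snapshot lock-related waits before the test run.
SELECT wait_type, waiting_tasks_count, wait_time_ms
FROM sys.dm_os_wait_stats
WHERE wait_type LIKE 'LCK_M%';

SET STATISTICS TIME ON;                          -- per-statement CPU and elapsed time
EXEC dbo.usp_YourBizTalkProc @SomeParam = 1;     -- placeholder for the real SP under test
SET STATISTICS TIME OFF;

-- Re-run the wait-stats query and diff the numbers;
-- repeat the whole exercise under each isolation level you are considering.
```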

answered Jan 14 '23 by jmucchiello