Why should or shouldn't I use dirty reads (SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED) in SQL Server?
Dirty Reads
A dirty read occurs when a transaction reads data that has not yet been committed. For example, suppose transaction 1 updates a row. Transaction 2 reads the updated row before transaction 1 commits the update.
One of the most common problems that occur while running concurrent transactions is the Dirty Read problem. A dirty read occurs when one transaction is permitted to read data that is being modified by another transaction that is running concurrently but which has not yet committed itself.
A dirty read occurs when a transaction is allowed to read a row that has been modified by another transaction that has not yet committed. It mainly happens when multiple transactions run at the same time and one of them still has uncommitted changes.
A dirty read is a read of uncommitted data. If a particular row is modified by another running application and not yet committed, and our application reads that same row and sees the uncommitted data, that state is called a dirty read.
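To make the scenario concrete, here is a minimal sketch of a dirty read in SQL Server. The table dbo.Accounts and its columns are hypothetical names used only for illustration: session 1 updates a row without committing, and session 2 reads it under READ UNCOMMITTED.

-- Session 1: update a row inside a transaction, but do not commit yet.
BEGIN TRANSACTION;
UPDATE dbo.Accounts
SET Balance = Balance - 100
WHERE AccountId = 1;
-- (no COMMIT yet)

-- Session 2: under READ UNCOMMITTED the SELECT is not blocked by session 1's
-- lock and sees the uncommitted balance.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SELECT Balance
FROM dbo.Accounts
WHERE AccountId = 1;   -- dirty read: this value may never be committed

-- Session 1: roll back, so the value session 2 just read never officially existed.
ROLLBACK TRANSACTION;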
From MSDN:
When this option is set, it is possible to read uncommitted or dirty data; values in the data can be changed and rows can appear or disappear in the data set before the end of the transaction.
Simply put, when you are using this isolation level, and you are performing multiple queries on an active table as part of one transaction, there is no guarantee that the information returned to you within different parts of the transaction will remain the same. You could query the same data twice within one transaction and get different results (this might happen in the case where a different user was updating the same data in the midst of your transaction). This can obviously have severe ramifications for parts of your application that rely on data integrity.
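As a rough sketch of that situation (dbo.Orders and its Status column are hypothetical names), the same query run twice inside one READ UNCOMMITTED transaction can return different results while another session is modifying the table:

SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
BEGIN TRANSACTION;

-- First read of the data.
SELECT COUNT(*) FROM dbo.Orders WHERE Status = N'Pending';

-- ... another session inserts, updates, or rolls back Orders rows here ...

-- Second read within the same transaction: the count may differ.
SELECT COUNT(*) FROM dbo.Orders WHERE Status = N'Pending';

COMMIT TRANSACTION;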
Generally, use it when you need to run sizeable (or frequent) queries against busy tables, where read committed would likely be blocked by locks from uncommitted transactions, but ONLY when you can live with inaccurate data.
As an example, on a gaming web site I worked on recently, there was a summary display of some stats about recent games, and it was all based on dirty reads. It was more important for us to include than exclude the transactional data not yet committed (we knew anyway that few, if any, transactions would be rolled back), and we felt that on average the data would be more accurate that way. A sketch of such a query is shown below.
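Purely as an illustration (the table and column names here are made up, not the actual site's schema), a reporting query of that kind might look like this, with READ UNCOMMITTED trading accuracy for not blocking behind writers:

SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

-- Summary of recent games; results may include rows from uncommitted transactions.
SELECT TOP (10)
       GameId,
       COUNT(*)   AS GamesPlayed,
       AVG(Score) AS AvgScore
FROM dbo.RecentGameStats
GROUP BY GameId
ORDER BY GamesPlayed DESC;

-- The per-table equivalent is the NOLOCK hint:
-- SELECT ... FROM dbo.RecentGameStats WITH (NOLOCK) ...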