I've tried searching through search engines, MSDN, etc. but can't find anything. Sorry if this has been asked before. Is there any performance difference between using the T-SQL BETWEEN keyword and using comparison operators?
C. != is much faster. Anything with an exclamation point is given top priority in the Oracle engine.
We can also compare execution plans that were saved earlier. To do so, open the saved query execution plan in SQL Server Management Studio 2016, right-click on the execution plan, and click Compare Showplan.
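If you would rather capture the plans from a query window than from the GUI, here is a minimal sketch (reusing the dbo.perftest table from the benchmark later in this thread); SET STATISTICS XML returns the actual plan for each statement, and the plans can then be saved as .sqlplan files and fed into Compare Showplan:

-- Return the actual execution plan XML alongside each result set.
SET STATISTICS XML ON;

-- Same range written two ways; compare the two plans that come back.
SELECT mytext FROM dbo.perftest WHERE id BETWEEN 100 AND 500;
SELECT mytext FROM dbo.perftest WHERE id >= 100 AND id <= 500;

SET STATISTICS XML OFF;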
BETWEEN is faster due to fewer comparisons. With an IN clause, every element is traversed each time. But the purpose of each is different: BETWEEN is used when you are comparing against a range of values in some kind of sequence, whereas IN takes a list of discrete values. A small example follows below.
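As a quick illustration of that distinction (again borrowing the dbo.perftest table from the benchmark further down), the first query matches a contiguous range while the second matches only the listed values:

-- Range: every id from 100 through 500 inclusive.
SELECT mytext FROM dbo.perftest WHERE id BETWEEN 100 AND 500;

-- Discrete list: only these exact ids.
SELECT mytext FROM dbo.perftest WHERE id IN (100, 250, 500);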
BETWEEN is technically greater than or equal to PLUS less than or equal to.
You can check this easily enough by checking the query plans in both situations. There is no difference of which I am aware. There is a logical difference, though, between BETWEEN and "<" / ">": BETWEEN is inclusive. It's equivalent to "<=" and ">=", as the sketch below shows.
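To make the inclusiveness concrete, here is a small sketch (assuming the dbo.perftest table created later in this thread); the first two queries return exactly the same rows, endpoints included, while the third excludes the endpoints:

-- BETWEEN includes both endpoints...
SELECT mytext FROM dbo.perftest WHERE id BETWEEN 100 AND 500;

-- ...so it is equivalent to this pair of inclusive comparisons.
SELECT mytext FROM dbo.perftest WHERE id >= 100 AND id <= 500;

-- An exclusive range has to be written out explicitly.
SELECT mytext FROM dbo.perftest WHERE id > 100 AND id < 500;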
The query engine converts BETWEEN into >= and <= (take a look at the query plan), so in practice they're identical, and in theory >= and <= are faster because the engine won't have to translate. Good luck noticing a difference, though.
I use BETWEEN anyway; I find it reads more easily.
Very complex queries/nested views with numerous BETWEEN comparisons might benefit from being changed to >= and <=, as this might potentially prevent optimisation timeouts by reducing the time spent on rewriting the query (just a theory, untested by me, and I've never noticed it make a difference). See the plan sketch below.
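One rough way to see that conversion for yourself (using the dbo.perftest table from the benchmark that follows) is to ask for the estimated plan text; the seek predicate for a BETWEEN query comes back as a pair of >= and <= comparisons:

-- Show the estimated plan instead of executing the statement.
SET SHOWPLAN_TEXT ON;
GO
-- The seek predicate in the plan reads id >= 100 AND id <= 500,
-- even though the query was written with BETWEEN.
SELECT mytext FROM dbo.perftest WHERE id BETWEEN 100 AND 500;
GO
SET SHOWPLAN_TEXT OFF;
GO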
I love it when folks give code so you can do your own testing, but you need to use a larger data set and repeated runs to account for indexes being loaded into memory, etc., before jumping to conclusions. Here is the same code with a larger table, run over several iterations:
DECLARE
@Startdatetime datetime ,
@Diff int = 0 ,
@Addrowcount int = 1000 ,
@ptr int = 1;
SET NOCOUNT ON;
--Create a temporary table to perform our tests on
IF OBJECT_ID('dbo.perftest', 'U') IS NOT NULL
    DROP TABLE dbo.perftest;
CREATE TABLE dbo.perftest( id int NOT NULL
IDENTITY(1 , 1)
PRIMARY KEY ,
mytext nvarchar( 50 ) NOT NULL );
--Now add some sample rows
SET @Addrowcount = 20000;
WHILE(@Addrowcount > 0)
BEGIN
INSERT INTO dbo.perftest( mytext )
VALUES( 'thetext' );
SET @Addrowcount = @Addrowcount - 1;
END;
WHILE @ptr < 10 -- do this a few times to account for indexes being loaded into memory
BEGIN
SELECT @Startdatetime = GETDATE();
-- do method 1 here
SELECT mytext
FROM dbo.perftest
WHERE(id >= (100 + (@ptr * 1000)))
AND (id <= (500 + (@ptr * 1000)));
--end method1
SELECT @Diff = DATEDIFF( millisecond , @Startdatetime , GETDATE());
PRINT ':Method 1: ' + CAST(@Diff AS nvarchar( 20 )) + ' ms';
--reset start time
SELECT @Startdatetime = GETDATE();
--do method2 here
SELECT mytext
FROM dbo.perftest
WHERE id BETWEEN (300 + (@ptr * 1000))
AND (800 + (@ptr * 1000));
--end method2
SELECT @Diff = DATEDIFF( millisecond , @Startdatetime , GETDATE());
PRINT ':Method 2: ' + CAST(@Diff AS nvarchar( 20 )) + ' ms';
SET @ptr = @ptr + 1;
END;
Gives you a very different set of results:
--Method 1 -- 10 ms
--Method 2 -- 33 ms
--Method 1 -- 40 ms
--Method 2 -- 26 ms
--Method 1 -- 23 ms
--Method 2 -- 23 ms
--Method 1 -- 13 ms
--Method 2 -- 16 ms
--Method 1 -- 13 ms
--Method 2 -- 20 ms
--Method 1 -- 6 ms
--Method 2 -- 16 ms
--Method 1 -- 26 ms
--Method 2 -- 16 ms
--Method 1 -- 13 ms
--Method 2 -- 13 ms
--Method 1 -- 16 ms
--Method 2 -- 13 ms
I would say, from this (still pretty unscientific) test, that there isn't much difference either way.