I'd like to get the total count of results and top n rows of some query - is it possible in one statement?
I'd expect the results as:
count(..) column1 column2
125 some_value some_value
125 some_value some_value
Thank you in advance!
Like this:
SELECT TOP 100 --optional
    MC.Cnt, M.Column1, M.Column2
FROM
    myTable M
    CROSS JOIN
    (SELECT COUNT(*) AS Cnt FROM myTable) MC
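One thing to watch with this pattern: if the outer query is filtered, the same WHERE clause has to be repeated inside the count subquery, otherwise Cnt reflects the whole table. A sketch of the filtered variant (SomeCol and the literal are made up):
SELECT TOP 100 --optional
    MC.Cnt, M.Column1, M.Column2
FROM
    myTable M
    CROSS JOIN
    (SELECT COUNT(*) AS Cnt
     FROM myTable
     WHERE SomeCol = 'x') MC -- same filter as the outer query
WHERE
    M.SomeCol = 'x'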
Edit: after the downvote and the COUNT(*) OVER() answer, here is a comparison on two of my tables.
You can see a huge difference between my CROSS JOIN/simple aggregate and COUNT(*) with an empty OVER() clause.
SELECT COUNT(*) OVER() AS C, key1col, key2col
FROM myTable
(24717 row(s) affected)
Table 'Worktable'. Scan count 3, logical reads 49865, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'myTable'. Scan count 1, logical reads 77, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
StmtText
  |--Nested Loops(Inner Join)
       |--Table Spool
       |    |--Segment
       |         |--Index Scan(OBJECT:([MyDB].[dbo].[myTable].[IX_useful]))
       |--Nested Loops(Inner Join, WHERE:((1)))
            |--Compute Scalar(DEFINE:([Expr1003]=CONVERT_IMPLICIT(int,[Expr1005],0)))
            |    |--Stream Aggregate(DEFINE:([Expr1005]=Count(*)))
            |         |--Table Spool
            |--Table Spool
SELECT
    MC.Cnt, M.key1col, M.key2col
FROM
    myTable M
    CROSS JOIN
    (SELECT COUNT(*) AS Cnt FROM myTable) MC
(24717 row(s) affected)
Table 'myTable'. Scan count 2, logical reads 154, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
StmtText
  |--Nested Loops(Inner Join)
       |--Compute Scalar(DEFINE:([Expr1005]=CONVERT_IMPLICIT(int,[Expr1009],0)))
       |    |--Stream Aggregate(DEFINE:([Expr1009]=Count(*)))
       |         |--Index Scan(OBJECT:([MyDB].[dbo].[myTable].[IX_useful]))
       |--Index Scan(OBJECT:([MyDB].[dbo].[myTable].[IX_useful] AS [M]))
I've repeated this on a table with 570k rows; here is the IO (following the pattern above, the Worktable and scan-count-1 lines belong to the COUNT(*) OVER() query and the scan-count-34 line to the CROSS JOIN):
Table 'Worktable'. Scan count 3, logical reads 1535456, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'myTable'. Scan count 1, logical reads 2929, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
Table 'myTable'. Scan count 34, logical reads 6438, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
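For reference, figures and plans like the above come from SQL Server's built-in switches; a minimal sketch of capturing them (reusing the query from above):
set statistics io on;
go
-- the Messages tab now reports scan counts and logical reads per table
select COUNT(*) over() as C, key1col, key2col
from myTable;
go
set statistics io off;
go

-- StmtText comes from a separate batch; while this is on, queries are not
-- executed, only their plan text is returned
set showplan_text on;
go
select COUNT(*) over() as C, key1col, key2col
from myTable;
go
set showplan_text off;
go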
What about:
SELECT COUNT(*) OVER() AS C, COLUMN1, COLUMN2
FROM TABLE
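If only the top n rows are wanted along with the total, this combines naturally with TOP and an ORDER BY, since the window count is computed over all qualifying rows before TOP trims the output. A sketch (table, columns and sort key are placeholders):
SELECT TOP (100)
       COUNT(*) OVER() AS TotalCnt, -- count of all qualifying rows, not just the 100 returned
       COLUMN1,
       COLUMN2
FROM   myTable
ORDER BY COLUMN1;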
Regarding CROSS JOIN queries: in a heavy INSERT/DELETE environment, the cross join will return an incorrect row count, because the count subquery and the outer scan read the table at different points in time. Try this from two connections.
connection 1
set nocount on;
drop table dbo.test_table;
GO
create table dbo.test_table
(
id_field uniqueidentifier not null default(newid()),
filler char(2000) not null default('a')
);
GO
create unique clustered index idx_id_fld on dbo.test_table(id_field);
GO
while 1 = 1
insert into dbo.test_table default values;
connection 2
select T2.cnt, T1.id_field, T1.filler
from dbo.test_table T1
cross join (select COUNT(*) as cnt from dbo.test_table) T2
select T2.cnt, T1.id_field, T1.filler
from dbo.test_table T1
cross join (select COUNT(*) as cnt from dbo.test_table) T2
select T2.cnt, T1.id_field, T1.filler
from dbo.test_table T1
cross join (select COUNT(*) as cnt from dbo.test_table) T2
Each time, the number of rows returned (@@ROWCOUNT) is different from T2.cnt.
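One way to see both numbers side by side is to read @@ROWCOUNT straight after one of the selects; a sketch:
select T2.cnt, T1.id_field, T1.filler
from dbo.test_table T1
cross join (select COUNT(*) as cnt from dbo.test_table) T2;

-- @@ROWCOUNT still refers to the select above at this point
select @@ROWCOUNT as rows_returned;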
With COUNT(*) OVER(), there is only a single table scan and @@ROWCOUNT always matches the returned count.
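For comparison, a windowed version of the same check against the test table (a sketch; only the counting changes):
select COUNT(*) over() as cnt, T1.id_field, T1.filler
from dbo.test_table T1;

-- with the single scan, this matches cnt from the result set above
select @@ROWCOUNT as rows_returned;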
Regarding query plans: SQL 2005 SP3 appears to be much weaker at doing COUNT(*) OVER() than SQL 2008 R2. On top of that, it incorrectly reports query costs (I never thought a subquery could cost more than 100% of the entire query).
In a lot of scenarios, the cost of the COUNT(*) OVER() is between 50-75% of the CROSS JOIN.
The best-case scenario for the cross join is when there is a very narrow index to do the count on: that way there is a clustered index scan for the data plus a scan of the narrow index for the count.
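As an illustration of that best case, a narrow single-column index on the test table from earlier could look like this (the index name is made up):
-- far fewer pages to scan for COUNT(*) than the clustered index,
-- whose leaf rows carry the 2000-byte filler column
create nonclustered index idx_id_only on dbo.test_table (id_field);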
As always, it's best to measure, measure, measure and go with the compromise that you're happy to live with.