I have a table similar to the one shown below. It contains a list of user IDs, an Hour value for each hour of the day, and an Avail flag indicating whether that user is available during that hour.
I need to list all user IDs that are available for a number of consecutive hours, defined as @n.
######################
# UID # Avail # Hour #
######################
# 123 #   1   #  0   #
# 123 #   1   #  1   #
# 123 #   0   #  2   #
# 123 #   0   #  3   #
# 123 #   0   #  4   #
# 123 #   1   #  5   #
# 123 #   1   #  6   #
# 123 #   1   #  7   #
# 123 #   1   #  8   #
# 341 #   1   #  0   #
# 341 #   1   #  1   #
# 341 #   0   #  2   #
# 341 #   1   #  3   #
# 341 #   1   #  4   #
# 341 #   0   #  5   #
# 341 #   1   #  6   #
# 341 #   1   #  7   #
# 341 #   0   #  8   #
######################
This should result in the following output for @n=3
#######
# UID #
#######
# 123 #
#######
I have attempted to use ROW_NUMBER() OVER (PARTITION BY UID, Avail ORDER BY UID, Hour) to assign a number to each row, partitioned by the UID and by whether or not the user is flagged as available. However, this does not work: a user's availability can change several times a day, and ROW_NUMBER() only keeps two running counts per user (one per value of the Avail flag), so separate blocks of availability are numbered as if they were one continuous run.
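For reference, the attempt looked roughly like this (a sketch, assuming the table is called Table1 as in the answers below):
select UID, Avail, Hour,
       -- partitioning by UID and Avail lumps every available row for a user
       -- into one partition, regardless of the gaps between runs
       row_number() over (partition by UID, Avail order by UID, Hour) as rn
from Table1;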
For the Tabibitosan method, assign each row an increasing number, ordered by the value that should be consecutive (here, the Hour). When you subtract this number from that value, consecutive rows produce the same result, and you can then group by this calculated value to get summary statistics for each unbroken run.
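A sketch of that idea against this table (assuming the data lives in Table1, with the literal 3 standing in for @n):
;with runs as
(
    select UID, Avail, Hour,
           -- Hour minus a per-user, per-flag row number is constant
           -- within each unbroken run of hours with the same Avail value
           Hour - row_number() over (partition by UID, Avail order by Hour) as grp
    from Table1
)
select distinct UID
from runs
where Avail = 1
group by UID, grp
having count(*) >= 3;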
A common table expression (CTE) is a construct in SQL that helps simplify a query. CTEs work as virtual tables (with rows and columns) that are created during the execution of a query, used by that query, and discarded once it finishes.
If you're using SQL Server 2012+ you could use a windowed SUM, but you have to specify the number of rows in the window frame in advance, since the frame won't accept variables, so it's not that flexible:
;with cte as
(
    select distinct
        UID,
        -- number of available hours in the 3-hour window starting at this row
        SUM(avail) over (partition by uid
                         order by hour
                         rows between current row and 2 following
                        ) as cnt
    from table1
)
select uid from cte where cnt = 3;
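With the sample data above this returns only UID 123: hours 5 through 8 give it a run of at least three consecutive available hours, while 341 never has more than two in a row.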
If you want flexibility you could make it a stored procedure and use dynamic SQL to build and execute the statement, something like this:
create procedure testproc (@n int)
as
begin
    declare @sql nvarchar(max);

    -- build the windowed-SUM query with a frame of @n rows
    set @sql = concat('
    ;with cte as
    (
        select distinct
            UID,
            SUM(avail) over (partition by uid
                             order by hour
                             rows between current row and ', @n - 1, ' following
                            ) as cnt
        from table1
    )
    select uid from cte where cnt = ', @n, ';');

    exec sp_executesql @sql;
end
and then execute it with execute testproc 3.
An even more inflexible solution is to use correlated subqueries, but then you have to add another EXISTS subquery for each additional hour you want to check:
select distinct uid
from Table1 t1
where Avail = 1
  -- the following two hours must also be available for the same user
  and exists (select 1 from Table1 where Avail = 1 and UID = t1.UID and Hour = t1.Hour + 1)
  and exists (select 1 from Table1 where Avail = 1 and UID = t1.UID and Hour = t1.Hour + 2);
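Note that Hour = t1.Hour + 1 and Hour = t1.Hour + 2 rely on there being exactly one row per user per hour, which the table guarantees since every hour of the day is listed with its Avail flag.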
And yet another way, using row_number to find islands and then filtering by sum of avail for each island:
;with c as (
    select
        uid, avail,
        -- the difference of the two row numbers is constant within each
        -- unbroken run of rows with the same Avail value (an "island")
        row_number() over (partition by uid order by hour)
        - row_number() over (partition by uid, avail order by hour) grp
    from table1
)
select uid from c
group by uid, grp
having sum(avail) >= 3;
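The literal 3 here stands in for @n; unlike the window-frame version, a plain variable is allowed in the HAVING clause, so having sum(avail) >= @n works without resorting to dynamic SQL.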
This works: it self-joins the table on UID, pairing each available hour with the available hours that fall within the @n-hour (here 3-hour) window starting at it, and then keeps only the users for whom some window contains all 3 available hours.
SELECT DISTINCT A.UID
FROM foo A
INNER JOIN foo B
    ON  A.UID = B.UID
    AND B.Hour BETWEEN A.Hour AND A.Hour + 2   -- B falls inside the 3-hour window starting at A.Hour
    AND A.Avail = 1 AND B.Avail = 1
GROUP BY A.UID, A.Hour
HAVING COUNT(DISTINCT B.Hour) = 3              -- every hour in that window is available
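As in the other non-dynamic versions the window size is hard-coded: the + 2 in the join condition plays the role of @n - 1 and the 3 in the HAVING clause the role of @n.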
http://sqlfiddle.com/#!6/f97ee