
Pyspark - Window Functions Range Between Date Offset

Tags: pyspark

Given the example data below, I would like to count, for each row, the number of rows in which the same 'var1' value was seen within the last 3 days.

from pyspark.sql.types import StructType, StructField, StringType, IntegerType, DateType

_schema = StructType([StructField("date", StringType(), True),
                      StructField("var1", IntegerType(), True),
                      StructField("var2", StringType(), True)])


test_list = [('2017-01-30', 123, 'A'),
             ('2017-01-17', 123, 'B'),
             ('2017-01-15', 123, 'A'),
             ('2017-01-15', 123, 'A'),
             ('2017-01-14', 123, 'A'),
             ('2017-01-11', 123, 'B'),
             ('2017-01-29', 456, 'A'),
             ('2017-01-22', 789, 'B'),
             ('2017-01-21', 789, 'B'),
             ('2017-01-20', 789, 'A'),
             ('2017-01-19', 789, 'A')]

df = sqlContext.createDataFrame(test_list, schema=_schema)
df = df.withColumn('date', df.date.cast(DateType()))

I am not sure how to set rangeBetween so that, for each row, the window includes only rows with the same var1 value (e.g. 123) whose date falls within the 3 days prior to the current row's date, not including the current date. For example, the row ('2017-01-17', 123, 'B') should get a count of 3 (the 2017-01-14 row plus the two 2017-01-15 rows).

from pyspark.sql import functions as F
from pyspark.sql.window import Window

wSpec1 = Window.partitionBy('var1').orderBy('date').rangeBetween(-3, -1)

df.withColumn("events_past_3days", F.count(df.var2).over(wSpec1))

This gives me an error that is beyond my experience:

AnalysisException: u'Window specification windowspecdefinition(var1#368, date#374 ASC, RANGE BETWEEN 3 PRECEDING AND 1 PRECEDING) is not valid because The data type of the expression in the ORDER BY clause should be a numeric type.'
(long logical plan omitted)
Asked by B_Miner

1 Answer

One solution I found is to create a numeric date-offset column and use that in rangeBetween. I wonder if anyone has other methods?

# Add a numeric day offset to order the window by
from datetime import date

df = df.withColumn('dayssinceJan11900', F.datediff(df.date, F.lit(date(1900, 1, 1))))

wSpec1 = Window.partitionBy('var1').orderBy('dayssinceJan11900').rangeBetween(-3, -1)
df = df.withColumn("events_past_3days", F.count(df.var2).over(wSpec1))
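A variant that I believe should also work (just a sketch, not tested on this data; wSpec2 and the day constant are illustrative names of my own) is to skip the helper column and order by the date cast to seconds since the epoch, giving the frame bounds in seconds:

# Sketch: order by the date as epoch seconds and express the range in seconds
day = 86400  # seconds per day
wSpec2 = (Window.partitionBy('var1')
          .orderBy(F.col('date').cast('timestamp').cast('long'))
          .rangeBetween(-3 * day, -1))

df.withColumn("events_past_3days", F.count(df.var2).over(wSpec2))

Either way, df.orderBy('var1', 'date').show() should show counts like 0 for the (2017-01-30, 123) row and 3 for the (2017-01-22, 789) row.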
Answered by B_Miner