I have a Spark time series DataFrame that I would like to split 80-20 (train/test). Since this is time series data, I don't want a random split. How do I do this so that I can pass the first DataFrame to train and the second to test?
With time series data, keep the earliest records for training and the most recent ones for testing. For example, if you had 144 records at monthly intervals (12 years), a good approach would be to keep the first 120 records (10 years) for training and the last 24 records (2 years) for testing.
You can use pyspark.sql.functions.percent_rank() to get the percentile ranking of your DataFrame ordered by the timestamp/date column, then take all rows with a rank <= 0.8 as your training set and the rest as your test set.
For example, if you had the following DataFrame:
df.show(truncate=False)
#+---------------------+---+
#|date |x |
#+---------------------+---+
#|2018-01-01 00:00:00.0|0 |
#|2018-01-02 00:00:00.0|1 |
#|2018-01-03 00:00:00.0|2 |
#|2018-01-04 00:00:00.0|3 |
#|2018-01-05 00:00:00.0|4 |
#+---------------------+---+
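(To reproduce this example, here's a minimal sketch that builds the sample DataFrame; the assumption is an existing SparkSession bound to a variable named spark:)
from pyspark.sql.functions import to_timestamp
data = [("2018-01-01 00:00:00", 0),
        ("2018-01-02 00:00:00", 1),
        ("2018-01-03 00:00:00", 2),
        ("2018-01-04 00:00:00", 3),
        ("2018-01-05 00:00:00", 4)]
df = spark.createDataFrame(data, ["date", "x"]).withColumn("date", to_timestamp("date"))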
You'd want the first 4 rows in your training set and the last one in your test set. First, add a column rank:
from pyspark.sql.functions import percent_rank
from pyspark.sql import Window
# An empty partitionBy() pulls every row into a single partition so that
# percent_rank() sees the whole DataFrame; fine for small data, but a
# potential bottleneck at scale.
df = df.withColumn("rank", percent_rank().over(Window.partitionBy().orderBy("date")))
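With five rows, percent_rank() assigns (rank - 1) / (number of rows - 1), so the ranks should come out as:
df.show(truncate=False)
#+---------------------+---+----+
#|date                 |x  |rank|
#+---------------------+---+----+
#|2018-01-01 00:00:00.0|0  |0.0 |
#|2018-01-02 00:00:00.0|1  |0.25|
#|2018-01-03 00:00:00.0|2  |0.5 |
#|2018-01-04 00:00:00.0|3  |0.75|
#|2018-01-05 00:00:00.0|4  |1.0 |
#+---------------------+---+----+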
Now use rank to split your data into train and test:
train_df = df.where("rank <= .8").drop("rank")
train_df.show()
#+---------------------+---+
#|date |x |
#+---------------------+---+
#|2018-01-01 00:00:00.0|0 |
#|2018-01-02 00:00:00.0|1 |
#|2018-01-03 00:00:00.0|2 |
#|2018-01-04 00:00:00.0|3 |
#+---------------------+---+
test_df = df.where("rank > .8").drop("rank")
test_df.show()
#+---------------------+---+
#|date |x |
#+---------------------+---+
#|2018-01-05 00:00:00.0|4 |
#+---------------------+---+
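If you need this split in more than one place, the pattern is easy to wrap in a small helper. Here's a minimal sketch; the function name train_test_split_by_time and its arguments are my own invention, not part of PySpark:
from pyspark.sql import DataFrame, Window
from pyspark.sql.functions import percent_rank

def train_test_split_by_time(df: DataFrame, time_col: str, train_fraction: float = 0.8):
    """Split df chronologically: the earliest train_fraction of rows go to train."""
    window = Window.partitionBy().orderBy(time_col)
    ranked = df.withColumn("__rank", percent_rank().over(window))
    train = ranked.where(ranked["__rank"] <= train_fraction).drop("__rank")
    test = ranked.where(ranked["__rank"] > train_fraction).drop("__rank")
    return train, test

# Usage:
# train_df, test_df = train_test_split_by_time(df, "date", 0.8)
The double-underscore column name is just to avoid clobbering an existing column; rename it if your DataFrame already has one called __rank.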