I have three columns for year, month, and day. How can I use these to create a date column in PySpark?
You can use concat_ws() to concatenate the columns with - as the separator and then cast the result to date.
# sample data
df.show()
#+----+-----+---+
#|year|month|day|
#+----+-----+---+
#|2020|   12| 12|
#+----+-----+---+
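(If you want to reproduce this, here is a minimal sketch of building the sample DataFrame; the integer column types are an assumption, strings work the same way.)
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
# year, month, day assumed to be integer columns
df = spark.createDataFrame([(2020, 12, 12)], ["year", "month", "day"])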
from pyspark.sql.functions import *
df.withColumn("date",concat_ws("-",col("year"),col("month"),col("day")).cast("date")).show()
#+----+-----+---+----------+
#|year|month|day|      date|
#+----+-----+---+----------+
#|2020|   12| 12|2020-12-12|
#+----+-----+---+----------+
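One thing to keep in mind (only relevant if your columns can contain nulls): concat_ws skips null inputs rather than propagating them, so a missing month or day can produce a partial string such as 2020-12 instead of a null date. If you would rather get null in that case, concat (which returns null when any input is null) is an option; a sketch:
from pyspark.sql.functions import concat, lit, col
# concat propagates nulls, so a missing year/month/day yields a null date
df.withColumn("date", concat(col("year"), lit("-"), col("month"), lit("-"), col("day")).cast("date")).show()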
# dynamic way: build the date from a list of column names
cols=["year","month","day"]
df.withColumn("date",concat_ws("-",*cols).cast("date")).show()
#+----+-----+---+----------+
#|year|month|day|      date|
#+----+-----+---+----------+
#|2020|   12| 12|2020-12-12|
#+----+-----+---+----------+
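A caveat on single-digit months or days (an assumption about your data, since the sample only has two-digit values): the concatenated string then looks like 2020-1-5. The plain cast shown above is usually lenient about that, but the explicit yyyy-MM-dd patterns in the variants below may return null under Spark 3's stricter date parsing, so zero-padding with lpad first is one way to stay safe; a sketch:
from pyspark.sql.functions import lpad
# zero-pad month and day so the string always matches yyyy-MM-dd
padded = concat_ws("-", col("year"), lpad(col("month").cast("string"), 2, "0"), lpad(col("day").cast("string"), 2, "0"))
df.withColumn("date", padded.cast("date")).show()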
# the same result using date_format, to_timestamp, to_date, or from_unixtime(unix_timestamp)
df.withColumn("date",date_format(concat_ws("-",*cols),"yyyy-MM-dd").cast("date")).show()
df.withColumn("date",to_timestamp(concat_ws("-",*cols),"yyyy-MM-dd").cast("date")).show()
df.withColumn("date",to_date(concat_ws("-",*cols),"yyyy-MM-dd")).show()
df.withColumn("date",from_unixtime(unix_timestamp(concat_ws("-",*cols),"yyyy-MM-dd"),"yyyy-MM-dd").cast("date")).show()
#+----+-----+---+----------+
#|year|month|day|      date|
#+----+-----+---+----------+
#|2020|   12| 12|2020-12-12|
#+----+-----+---+----------+
For Spark 3+, you can use the make_date function:
df = df.withColumn("date", expr("make_date(year, month, day)"))
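In newer PySpark releases (3.3+, if I'm not mistaken), make_date is also exposed directly in pyspark.sql.functions, so the expr wrapper is not needed:
from pyspark.sql.functions import make_date
df = df.withColumn("date", make_date("year", "month", "day"))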