PySpark: Extract date from datetime value

I am trying to figure out how to extract a date from a datetime value using PySpark SQL.

The datetime values look like this:

DateTime
2018-05-21T00:00:00.000-04:00
2016-02-22T02:00:02.234-06:00

When I load this into a Spark DataFrame and try to extract the date via

Date() or
Timestamp() and then Date()

I always get an error saying that a date or timestamp value is expected, but a DateTime value was provided.

Can someone help me retrieve the date from this value? I think you need to provide a timezone for that, but since I already had problems extracting just the date, I wanted to solve that first.

Thank you and kind regards.

asked Aug 16 '18 by bublitz



2 Answers

PySpark has a to_date function to extract the date from a timestamp. In your example, you could create a new column with just the date as follows:

import pyspark.sql.functions as func

df = df.withColumn("date_only", func.to_date(func.col("DateTime")))

If the column you are trying to convert is a string, you can set the format parameter of to_date to specify the datetime format of the string.
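For instance, a minimal sketch for the strings shown in the question (the format string here is my assumption; "XXX" matches the trailing -04:00 offset in Spark's datetime pattern syntax):

# assumes the DateTime column holds ISO-8601 strings like the examples above
df = df.withColumn(
    "date_only",
    func.to_date(func.col("DateTime"), "yyyy-MM-dd'T'HH:mm:ss.SSSXXX")
)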

You can read more about to_date in the PySpark documentation.

answered Sep 18 '22 by vielkind


You can use the date_format(), from_unixtime(), or to_date() functions to extract the date from the input string.

Example:

The input DataFrame df looks as follows:

# sample DataFrame with the timestamps as strings
df = spark.createDataFrame([('2018-05-21T00:00:00.000-04:00',), ('2016-02-22T02:00:02.234-06:00',)], ['ts'])

# set the session timezone to UTC
spark.sql("set spark.sql.session.timeZone=UTC")

df.show(10, False)
#+-----------------------------+
#|ts                           |
#+-----------------------------+
#|2018-05-21T00:00:00.000-04:00|
#|2016-02-22T02:00:02.234-06:00|
#+-----------------------------+

1. Using the date_format() function:

from pyspark.sql.functions import *
df.select(date_format(col('ts'), "yyyy-MM-dd").cast("date").alias('ts')).show(10, False)
#+----------+
#|ts        |
#+----------+
#|2018-05-21|
#|2016-02-22|
#+----------+

2. Using the to_date() function:

df.select(to_date(col('ts')).alias('ts')).show(10, False)
#+----------+
#|ts        |
#+----------+
#|2018-05-21|
#|2016-02-22|
#+----------+

3. Using the from_unixtime(unix_timestamp()) functions:

df.select(from_unixtime(unix_timestamp(col('ts'), "yyyy-MM-dd'T'HH:mm:ss.SSS"), "yyyy-MM-dd").cast("date").alias("ts")).show(10, False)
#+----------+
#|ts        |
#+----------+
#|2018-05-21|
#|2016-02-22|
#+----------+
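
Note that the pattern in option 3 stops after the milliseconds, so older Spark versions silently ignore the trailing UTC offset and interpret the wall-clock time in the session timezone. A sketch of a variant that parses the offset explicitly (assuming your Spark version accepts the "XXX" pattern letters for the offset):

# parse the -04:00 / -06:00 offset as part of the timestamp
df.select(from_unixtime(unix_timestamp(col('ts'), "yyyy-MM-dd'T'HH:mm:ss.SSSXXX"), "yyyy-MM-dd").cast("date").alias("ts")).show(10, False)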
answered Sep 18 '22 by notNull