Spark: converting GMT timestamps to Eastern, taking daylight saving time into account

I'm trying to convert a column of GMT timestamp strings into a column of timestamps in the Eastern time zone, taking daylight saving time into account.

My column of timestamp strings looks like this:

'2017-02-01T10:15:21+00:00'

I figured out how to convert the string column into a timestamp in EST:

from pyspark.sql import functions as F

df2 = df1.withColumn('datetimeGMT', df1.myTimeColumnInGMT.cast('timestamp'))
df3 = df2.withColumn('datetimeEST', F.from_utc_timestamp(df2.datetimeGMT, "EST"))

But the times don't change with daylight saving time. Is there another function, or something else, that accounts for daylight saving when converting the timestamps?

EDIT: I think I figured it out. In the from_utc_timestamp call above, I needed to use "America/New_York" instead of "EST":

df3 = df2.withColumn('datetimeET', F.from_utc_timestamp(df2.datetimeGMT, "America/New_York"))
asked Aug 18 '17 by Bob Swain



1 Answer

I ended up figuring out the answer, so I figured I would add it here. I also think this question/answer is worthwhile because, while searching before posting the question, I couldn't find anything about daylight saving time for Spark. I probably should have realized I needed to search for the underlying Java time zone functions instead.

The answer ended up being to use the string "America/New_York" instead of "EST", which correctly applies daylight saving time.

from pyspark.sql import functions as F
df3 = df2.withColumn('datetimeET', F.from_utc_timestamp(df2.datetimeGMT, "America/New_York"))
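The reason this works is that "America/New_York" is a full IANA time zone ID whose UTC offset changes with daylight saving, while "EST" resolves to a fixed UTC-5 offset year-round. This can be illustrated outside Spark with Python's standard `zoneinfo` module (a minimal sketch, assuming Python 3.9+ with IANA tzdata available; the dates here are arbitrary winter/summer examples):

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# The same wall-clock time in UTC, once in winter and once in summer.
winter = datetime(2017, 2, 1, 10, 15, 21, tzinfo=timezone.utc)
summer = datetime(2017, 7, 1, 10, 15, 21, tzinfo=timezone.utc)

# "America/New_York" observes DST: UTC-5 in winter, UTC-4 in summer.
print(winter.astimezone(ZoneInfo("America/New_York")).hour)  # 5 (EST, UTC-5)
print(summer.astimezone(ZoneInfo("America/New_York")).hour)  # 6 (EDT, UTC-4)

# "EST" is a fixed-offset zone: UTC-5 all year, no DST adjustment.
print(summer.astimezone(ZoneInfo("EST")).hour)  # 5
```

The same distinction applies inside `from_utc_timestamp`: Spark delegates to the Java time zone database, so region-based IDs like "America/New_York" shift with DST while abbreviations like "EST" do not.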

EDIT:

This link shows a list of available time zone strings that can be used in this way: https://garygregory.wordpress.com/2013/06/18/what-are-the-java-timezone-ids/

answered Oct 25 '22 by Bob Swain