I want to store a timezone-aware column in postgresql using pandas to_sql.
When the times are not timezone-aware, it works:
times = ['201510100222', '201510110333']
df = pd.DataFrame()
df['time'] = pd.to_datetime(times)
df.time.to_sql('test', engine, if_exists='replace', index=False)
But when I specify UTC:
times = ['201510100222', '201510110333']
df = pd.DataFrame()
df['time'] = pd.to_datetime(times, utc=True)
df.time.to_sql('test', engine, if_exists='replace', index=False)
I get the following error:
ValueError: Cannot cast DatetimeIndex to dtype datetime64[us]
I'm using Python 3.4.3, PostgreSQL 9.4, pandas 0.17.1, and SQLAlchemy 1.0.5.
You have to convert the column to an object column holding pd.Timestamp values before writing it to PostgreSQL; the cast error comes from pandas trying to coerce the timezone-aware datetime64 column. The code below worked for me:
times = ['201510100222', '201510110333']
df = pd.DataFrame()
df['time'] = pd.to_datetime(times, utc=True)
df['time'] = df['time'].astype(pd.Timestamp)  # object dtype of tz-aware Timestamps
df.time.to_sql('test', engine, if_exists='replace', index=False)
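As a sanity check, the effect of the conversion can be verified without a database round-trip. A minimal sketch (assuming a recent pandas, where `astype(object)` performs the same conversion to an object column of `Timestamp` values):

```python
import pandas as pd

times = ['201510100222', '201510110333']
df = pd.DataFrame()
df['time'] = pd.to_datetime(times, utc=True)   # tz-aware datetime64 column

# Convert to an object column of individual pd.Timestamp values,
# which pandas can hand to SQLAlchemy without the datetime64 cast.
df['time'] = df['time'].astype(object)

print(df['time'].dtype)          # object
print(df['time'].iloc[0].tzinfo) # UTC
```

Each element is still a timezone-aware `pd.Timestamp`, so no information is lost in the conversion.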
But don't forget to create your database table with the column type TIMESTAMP WITH TIME ZONE. If you are letting the to_sql call create the table, you have to specify the type explicitly:
from sqlalchemy.types import TIMESTAMP as typeTIMESTAMP
df.time.to_sql('test', engine, if_exists='replace', index=False, dtype={'time': typeTIMESTAMP(timezone=True)})