I am trying to query data from a PostgreSQL database and insert it into an SQLite database.
Here is my code:
import pandas as pd
import pandas.io.sql as pd_sql
import sqlite3 as sql3
import psycopg2
#Acquire data FROM PostgreSQL DB
conn_pg = psycopg2.connect("dbname='xx' user='xxxxx' host='xxx.xxx.xx.xxx' password='xxxx'")
sql_1='SELECT * FROM table1 limit 5'
df_1=pd_sql.read_frame(sql_1,conn_pg)
conn_pg.close()
#Insert Into sqlite3 DB
conn_sqlite=sql3.connect('/xxxx/xxxx/xxxx/xxxx/my_db.db')
pd_sql.write_frame(df_1,'table1',conn_sqlite,'sqlite',if_exists='replace')
conn_sqlite.close()
df_1 has dtypes:
field1 object
field2 datetime64[ns]
field3 float64
field4 object
dtype: object
I am getting an error:
InterfaceError: Error binding parameter 1 - probably unsupported type.
on:
pd_sql.write_frame(df_1,'table1',conn_sqlite,'sqlite',if_exists='replace')
I am guessing sqlite does not like the datetime64 of field2. I need help to figure out:
1. Which date type I should convert field2 to in my dataframe and
2. How to do this in a pandas DataFrame
Any help would be much appreciated.
Cheers!
You are indeed correct that the datetime64 field is causing the trouble. SQLite has no native datetime type; it stores dates and times as TEXT, REAL, or INTEGER values instead (see http://www.sqlite.org/datatype3.html and http://www.sqlite.org/lang_datefunc.html).
So dependent on what you want to do, you can first convert your datetime column to a string:
df['field2'] = df['field2'].apply(str)
or to an integer. Note that .astype('int64') on a datetime64[ns] column gives the number of nanoseconds since 1970-01-01 00:00:00 UTC, not seconds; divide by 10**9 if you want seconds:
df['field2'] = df['field2'].astype('int64')  # nanoseconds since the epoch
and then write your data to sqlite.
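As a self-contained sketch of both conversions (using an in-memory SQLite database, made-up sample data, and the modern DataFrame.to_sql API rather than the deprecated write_frame; column names here are illustrative, not yours):

```python
import sqlite3
import pandas as pd

# Made-up sample frame standing in for the data pulled from PostgreSQL.
df = pd.DataFrame({
    "field2": pd.to_datetime(["2014-01-01 12:00:00", "2014-01-02 08:30:00"]),
    "field3": [1.5, 2.5],
})

# Option 1: datetime64 -> TEXT (an ISO-like string SQLite stores happily).
df_text = df.copy()
df_text["field2"] = df_text["field2"].apply(str)

# Option 2: datetime64 -> INTEGER. astype('int64') yields *nanoseconds*
# since the epoch; integer-divide by 10**9 to get Unix seconds.
df_int = df.copy()
df_int["field2"] = df_int["field2"].astype("int64") // 10**9

# Write both variants to an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
df_text.to_sql("table_text", conn, if_exists="replace", index=False)
df_int.to_sql("table_int", conn, if_exists="replace", index=False)

# Read back to confirm the stored representations.
text_rows = conn.execute("SELECT field2 FROM table_text").fetchall()
int_rows = conn.execute("SELECT field2 FROM table_int").fetchall()
conn.close()
```

Which option to pick depends on how you will query the column later: TEXT keeps it human-readable and works with SQLite's date functions, while INTEGER seconds are compact and easy to compare numerically.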
Sidenote: there was a bug in the sqlite if_exists='replace' implementation, which is fixed in 0.13.1 (the latest stable release at the moment).