I'm learning how to write a pandas DataFrame to a SQLite database.
I came across this example code:
import pandas as pd
import pandas.io.sql as pd_sql
import sqlite3 as sql
con = sql.connect("/home/msalese/Documents/ipyNotebooks/tmp.db")
df = pd.DataFrame({'TestData': [1, 2, 3, 4, 5, 6, 7, 8, 9]})
pd_sql.write_frame(df, "tbldata2", con)
But the above code raises an exception:
---------------------------------------------------------------------------
InterfaceError Traceback (most recent call last)
<ipython-input-31-c844f7e3f2e6> in <module>()
----> 1 pd_sql.write_frame(df, "tbldata2", con)
/opt/epdFree7.3.2/lib/python2.7/site-packages/pandas-0.10.1-py2.7-linux-x86_64.egg/pandas/io/sql.pyc in write_frame(frame, name, con, flavor, if_exists, **kwargs)
208 if func is None:
209 raise NotImplementedError
--> 210 func(frame, name, safe_names, cur)
211 cur.close()
212 con.commit()
/opt/epdFree7.3.2/lib/python2.7/site-packages/pandas-0.10.1-py2.7-linux-x86_64.egg/pandas/io/sql.pyc in _write_sqlite(frame, table, names, cur)
219 table, col_names, wildcards)
220 data = [tuple(x) for x in frame.values]
--> 221 cur.executemany(insert_query, data)
222
223 def _write_mysql(frame, table, names, cur):
InterfaceError: Error binding parameter 0 - probably unsupported type.
I think the problem is at line 220 of pandas/io/sql.py. If I try:
[tuple(x) for x in df.values]
the result is:
[(1,), (2,), (3,), (4,), (5,), (6,), (7,), (8,), (9,)]
Maybe the commas are confusing the SQLite database?
I'm not sure. Can someone give me a hint, please?
sqlite3 provides a DB-API interface for reading, querying, and writing SQLite databases from Python. It can be used with pandas to read SQL data into a familiar DataFrame, and the two together can also transfer data between the CSV and SQL formats.
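For reference, here is a minimal sketch of that round trip using the modern pandas API (DataFrame.to_sql and pandas.read_sql_query, which replaced write_frame in later pandas releases); the file names are just for illustration:

import sqlite3
import pandas as pd

con = sqlite3.connect("example.db")  # illustrative file name

# Write a DataFrame to a SQL table (newer replacement for write_frame).
df = pd.DataFrame({'TestData': [1, 2, 3, 4, 5, 6, 7, 8, 9]})
df.to_sql("tbldata2", con, if_exists="replace", index=False)

# Read it back into a DataFrame.
out = pd.read_sql_query("SELECT * FROM tbldata2", con)

# Pair with to_csv / read_csv to move between CSV and SQL formats.
out.to_csv("tbldata2.csv", index=False)
con.close()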
Refer to the answer from @unutbu in the comments.
The problem is avoided if you specify a float dtype: the message "probably unsupported type" comes from sqlite3 refusing to bind the NumPy integer values that pandas stores by default, whereas NumPy floats are accepted because numpy.float64 subclasses Python's float. For example,
df = pd.DataFrame({'TestData': [1, 2, 3, 4, 5, 6, 7, 8, 9]}, dtype='float')
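An alternative, if you want to keep the column as integers, is to tell sqlite3 how to adapt NumPy's integer type before binding. This is a minimal sketch assuming numpy.int64 is the offending type on your platform:

import sqlite3 as sql
import numpy as np
import pandas as pd
import pandas.io.sql as pd_sql

# Register an adapter so sqlite3 converts numpy.int64 values to
# plain Python ints before binding them as query parameters.
sql.register_adapter(np.int64, int)

con = sql.connect(":memory:")  # in-memory database for illustration
df = pd.DataFrame({'TestData': [1, 2, 3, 4, 5, 6, 7, 8, 9]})
pd_sql.write_frame(df, "tbldata2", con)  # now binds without the InterfaceError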