I'm using SQLAlchemy with pandas to query a PostgreSQL database and then insert the results of a transformation into another table on the same database. But when I do
df.to_sql('db_table2', engine)
I get this error message:
ValueError: Table 'db_table2' already exists.
I noticed it wants to create a new table. How do I insert a pandas DataFrame into an already existing table?
df = pd.read_sql_query('select * from "db_table1"',con=engine)
#do transformation then save df to db_table2
df.to_sql('db_table2', engine)
ValueError: Table 'db_table2' already exists
Create a DataFrame by calling the pandas DataFrame constructor and passing a Python dict as the data. Then invoke the to_sql() method on the DataFrame, specifying the table name and the database connection. This creates a table in the MySQL database server and populates it with the data from the DataFrame, as sketched below.
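A minimal sketch of that flow, assuming a MySQL engine (the connection string and the table name 'users' are placeholders; any SQLAlchemy-supported database works the same way):
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection string; substitute your own credentials.
engine = create_engine('mysql+pymysql://user:password@localhost/mydb')

# Build a DataFrame from a plain Python dict.
df = pd.DataFrame({'name': ['alice', 'bob'], 'age': [30, 25]})

# Create the table if it does not exist, then insert the rows.
df.to_sql('users', engine, if_exists='append', index=False)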
Make use of the if_exists parameter:
df.to_sql('db_table2', engine, if_exists='replace')
or
df.to_sql('db_table2', engine, if_exists='append')
From the docstring:
"""
if_exists : {'fail', 'replace', 'append'}, default 'fail'
- fail: If table exists, raise a ValueError.
- replace: If table exists, drop it, recreate it, and insert data.
- append: If table exists, insert data. Create if does not exist.
"""
Zen of Python:
Explicit is better than implicit.
df.to_sql(
    name,             # Name of the SQL table.
    con,              # sqlalchemy.engine.Engine or sqlite3.Connection.
    schema=None,      # Schema to use, if the database flavor supports schemas.
    if_exists='fail', # How to behave if the table already exists: 'fail', 'replace' or 'append'.
    index=True,       # Write the DataFrame index as a column; set False to ignore the index.
    index_label=None, # Column label(s) for the index column(s); only used when index=True.
    chunksize=None,   # Write rows in batches of this size; useful for large DataFrames.
    dtype=None,       # Dict mapping column names to SQLAlchemy types for the SQL table.
    method=None,      # SQL insertion clause: None (one INSERT per row), 'multi', or a callable.
)
So I would normally recommend something like this example:
from sqlalchemy.types import String, FLOAT, INT, DATETIME

df.to_sql(
    con=engine,
    name='table_name',
    if_exists='append',
    dtype={
        'Column1': String(255),
        'Column2': FLOAT,
        'Column3': INT,
        'createTime': DATETIME,
    },
    index=False,
)
Set the SQL table's primary key manually (for example an Id column) and enable auto-increment in Navicat or MySQL Workbench; the Id will then increment automatically.
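If you would rather do that in code than in a GUI, a sketch for MySQL (the table and column names are placeholders; other databases use different auto-increment syntax, e.g. IDENTITY columns in PostgreSQL):
from sqlalchemy import create_engine, text

# Placeholder connection string; substitute your own credentials.
engine = create_engine('mysql+pymysql://user:password@localhost/mydb')

with engine.connect() as conn:
    # MySQL-specific DDL: add an auto-incrementing primary key column.
    conn.execute(text('ALTER TABLE table_name ADD COLUMN Id INT AUTO_INCREMENT PRIMARY KEY'))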
The docstring of df.to_sql:
Parameters
----------
name : string
Name of SQL table.
con : sqlalchemy.engine.Engine or sqlite3.Connection
Using SQLAlchemy makes it possible to use any DB supported by that
library. Legacy support is provided for sqlite3.Connection objects.
schema : string, optional
Specify the schema (if database flavor supports this). If None, use
default schema.
if_exists : {'fail', 'replace', 'append'}, default 'fail'
How to behave if the table already exists.
* fail: Raise a ValueError.
* replace: Drop the table before inserting new values.
* append: Insert new values to the existing table.
index : bool, default True
Write DataFrame index as a column. Uses `index_label` as the column
name in the table.
index_label : string or sequence, default None
Column label for index column(s). If None is given (default) and
`index` is True, then the index names are used.
A sequence should be given if the DataFrame uses MultiIndex.
chunksize : int, optional
Rows will be written in batches of this size at a time. By default,
all rows will be written at once.
dtype : dict, optional
Specifying the datatype for columns. The keys should be the column
names and the values should be the SQLAlchemy types or strings for
the sqlite3 legacy mode.
method : {None, 'multi', callable}, default None
Controls the SQL insertion clause used:
* None : Uses standard SQL ``INSERT`` clause (one per row).
* 'multi': Pass multiple values in a single ``INSERT`` clause.
* callable with signature ``(pd_table, conn, keys, data_iter)``.
Details and a sample callable implementation can be found in the
section :ref:`insert method <io.sql.method>`.
.. versionadded:: 0.24.0
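For reference, a sketch of such a callable, adapted from the PostgreSQL COPY example in the pandas I/O docs (assumes the psycopg2 driver):
import csv
from io import StringIO

def psql_insert_copy(table, conn, keys, data_iter):
    # Get the raw DBAPI connection so we can use psycopg2's COPY support.
    dbapi_conn = conn.connection
    with dbapi_conn.cursor() as cur:
        # Serialize the batch of rows to an in-memory CSV buffer.
        buf = StringIO()
        csv.writer(buf).writerows(data_iter)
        buf.seek(0)
        columns = ', '.join('"{}"'.format(k) for k in keys)
        table_name = '{}.{}'.format(table.schema, table.name) if table.schema else table.name
        # COPY is much faster than row-by-row INSERTs for bulk loads.
        cur.copy_expert('COPY {} ({}) FROM STDIN WITH CSV'.format(table_name, columns), buf)

df.to_sql('db_table2', engine, if_exists='append', index=False, method=psql_insert_copy)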
That's all.