I have a table named user_data, with the columns id and user_id forming a unique key. I want to import some history data into this table, and I am using the bulk_insert_mappings method to batch insert it.
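Roughly, the setup looks like this (a minimal sketch: the model name UserData, the connection string, and the sample rows are illustrative, not my real ones):

from sqlalchemy import Column, Integer, UniqueConstraint, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class UserData(Base):
    __tablename__ = "user_data"
    id = Column(Integer, primary_key=True)
    user_id = Column(Integer, nullable=False)
    __table_args__ = (
        UniqueConstraint("id", "user_id", name="idx_on_id_and_user_id"),
    )

engine = create_engine("mysql+pymysql://user:pass@localhost/mydb")  # placeholder DSN
session = Session(engine)

history = [
    {"id": 1, "user_id": 1234},  # suppose this pair already exists in user_data
    {"id": 2, "user_id": 5678},
]
# The first duplicate hits the unique key and the whole batch fails,
# instead of the duplicate row simply being skipped.
session.bulk_insert_mappings(UserData, history)
session.commit()

This fails with: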
IntegrityError: (pymysql.err.IntegrityError) (1062, u"Duplicate entry '1-1234' for key 'idx_on_id_and_user_id'")
How can I ignore this error and discard the duplicate rows during the batch insert?
You should handle every error. But if you really want to just ignore them all, you can't really do a bulk insert: there will sometimes be integrity errors in the actual data you are importing, so you have to insert row by row and skip the failures. I would only use this in one-off scripts:
from sqlalchemy.exc import IntegrityError

for item in dict_list:
    try:
        # orm is your mapped class; merge() reconciles by primary key,
        # but a clash on the (id, user_id) unique key still raises
        session.merge(orm(**item))
        session.commit()
    except IntegrityError:
        session.rollback()  # discard the duplicate row and keep going
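One note on the design: catching IntegrityError from sqlalchemy.exc instead of a bare Exception means genuine bugs (connection drops, bad column names) still surface, and committing inside the loop limits each rollback to the single duplicate row rather than everything inserted so far.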