I have a table with 3 columns, each with a unique index.
I'd like to do multiple inserts at once (300 records a pop). When a duplicate entry occurs, it cancels the insert in its entirety: if even 1 of the 300 rows is a duplicate, none of them get inserted.
Is there a way around this?
Try changing your query from INSERT INTO ... to INSERT IGNORE INTO ... . This turns duplicate-key errors into warnings, so the rest of your records still get inserted.
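A minimal, self-contained sketch of the ignore semantics using Python's sqlite3 module (the table name, columns, and sample rows are made up for the demo; SQLite spells the keyword INSERT OR IGNORE, while MySQL spells it INSERT IGNORE):

```python
import sqlite3

# In-memory table with a unique index, standing in for the real schema.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a TEXT, b TEXT, c TEXT)")
conn.execute("CREATE UNIQUE INDEX idx_a ON t (a)")
conn.commit()

# The third row duplicates the unique value "x".
rows = [("x", "1", "1"), ("y", "2", "2"), ("x", "3", "3")]

# SQLite: INSERT OR IGNORE; MySQL: INSERT IGNORE.
# The duplicate row is silently skipped instead of aborting the batch.
conn.executemany("INSERT OR IGNORE INTO t VALUES (?, ?, ?)", rows)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM t").fetchone()[0])  # 2
```

Note that IGNORE suppresses other errors as warnings too, not just duplicate keys, so check the warning count if you care about the difference.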
If your inserts are idempotent, REPLACE or an ON DUPLICATE KEY UPDATE clause would help you out.
Given they likely are not, there isn't any super-efficient way to do this: something has to fall back to inserting individual rows in order to isolate the problem row.
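A sketch of the replace semantics, again via sqlite3 (SQLite accepts the same REPLACE INTO spelling as MySQL; table and values are invented for the demo). The conflicting old row is removed and the new one takes its place, which is why this only suits idempotent loads where overwriting is acceptable:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a TEXT, b TEXT)")
conn.execute("CREATE UNIQUE INDEX idx_a ON t (a)")
conn.commit()

conn.execute("INSERT INTO t VALUES ('x', 'old')")
# REPLACE deletes the row that conflicts on the unique index,
# then inserts the new row, so the statement never raises a duplicate error.
conn.execute("REPLACE INTO t VALUES ('x', 'new')")
conn.commit()

print(conn.execute("SELECT b FROM t WHERE a = 'x'").fetchone()[0])  # new
```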
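The fallback idea can be sketched client-side with sqlite3 (the insert_batch helper, table, and rows are hypothetical names for the demo): attempt the whole batch in one transaction, and only if it fails retry row by row, collecting the duplicates.

```python
import sqlite3

def insert_batch(conn, rows):
    """Try the whole batch in one transaction; on a duplicate-key
    failure, retry row by row so only the offending rows are skipped."""
    try:
        with conn:  # commits on success, rolls back the batch on error
            conn.executemany("INSERT INTO t VALUES (?, ?, ?)", rows)
        return []  # fast path: no duplicates
    except sqlite3.IntegrityError:
        pass  # slow path below isolates the problem row(s)
    bad = []
    for row in rows:
        try:
            with conn:
                conn.execute("INSERT INTO t VALUES (?, ?, ?)", row)
        except sqlite3.IntegrityError:
            bad.append(row)
    return bad

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (a TEXT, b TEXT, c TEXT)")
conn.execute("CREATE UNIQUE INDEX idx_a ON t (a)")
conn.commit()

rows = [("x", "1", "1"), ("y", "2", "2"), ("x", "3", "3")]
bad = insert_batch(conn, rows)
print(bad)  # the isolated duplicate row(s)
```

The cost is roughly one extra round of single-row inserts per failing batch, so it stays cheap as long as duplicates are rare.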
If you are batching inserts to reduce client-to-server latency, consider a stored procedure that takes all the rows in one shot, does the inserts on the server side, and falls back to appropriate row-by-row error handling when a batch fails.
That assumes, of course, that there is some meaningful error handling that can be done on the server side without synchronous communication with the client.