I'm working on a migration function. It reads data from the old table and inserts it into the new one. All of this runs on a background thread with low priority.
My steps, in pseudocode:
sqlite3_prepare_stmt (select statement)
sqlite3_prepare_stmt (insert statement)
while (sqlite3_step (select statement) == SQLITE_ROW)
{
    get data from select row results
    sqlite3_bind select results to insert statement
    sqlite3_step (insert statement)
    sqlite3_reset (insert statement)
}
sqlite3_reset (select statement)
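In terms of the real C API calls, the loop looks roughly like this (a simplified sketch: the table definitions are in UPD2 below, the importKey -> layer mapping and the NULL timestamp are assumptions of the sketch, and error handling is trimmed to printing sqlite3_errmsg):

#include <stdio.h>
#include <sqlite3.h>

/* Sketch of the loop above with the real C API calls.
 * Assumptions: importKey maps to layer, timestamp is left NULL,
 * and both tables live in the database behind `db`. */
static int migrate_tiles(sqlite3 *db)
{
    sqlite3_stmt *sel = NULL, *ins = NULL;
    int rc;

    rc = sqlite3_prepare_v2(db,
            "SELECT pk, data, x, y, z, importKey FROM tiles;",
            -1, &sel, NULL);
    if (rc != SQLITE_OK) goto done;

    rc = sqlite3_prepare_v2(db,
            "INSERT INTO tiles_v2 (pk, x, y, z, layer, data) "
            "VALUES (?1, ?2, ?3, ?4, ?5, ?6);",
            -1, &ins, NULL);
    if (rc != SQLITE_OK) goto done;

    while ((rc = sqlite3_step(sel)) == SQLITE_ROW) {
        /* select columns are 0-based, bind parameters are 1-based */
        sqlite3_bind_int64(ins, 1, sqlite3_column_int64(sel, 0));          /* pk */
        sqlite3_bind_int(ins, 2, sqlite3_column_int(sel, 2));              /* x  */
        sqlite3_bind_int(ins, 3, sqlite3_column_int(sel, 3));              /* y  */
        sqlite3_bind_int(ins, 4, sqlite3_column_int(sel, 4));              /* z  */
        sqlite3_bind_int(ins, 5, sqlite3_column_int(sel, 5));              /* importKey -> layer (assumed) */
        sqlite3_bind_blob(ins, 6, sqlite3_column_blob(sel, 1),
                          sqlite3_column_bytes(sel, 1), SQLITE_TRANSIENT); /* data */

        rc = sqlite3_step(ins);
        if (rc != SQLITE_DONE) {
            /* this is where "constraint failed" appears; on recent SQLite the
             * message names the column, e.g. "UNIQUE constraint failed: tiles_v2.pk" */
            fprintf(stderr, "insert failed: %s\n", sqlite3_errmsg(db));
            break;
        }
        sqlite3_reset(ins);
        sqlite3_clear_bindings(ins);
    }
    if (rc == SQLITE_DONE)
        rc = SQLITE_OK;

done:
    sqlite3_finalize(ins);
    sqlite3_finalize(sel);
    return rc;
}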
I'm always getting a 'constraint failed' error on sqlite3_step(insert statement). Why does it happen, and how can I fix it?
UPD: As I understand it, this happens because the background thread uses a db handle that was opened on the main thread. Checking that guess now.
UPD2:
sqlite> select sql from sqlite_master where tbl_name = 'tiles';
CREATE TABLE tiles('pk' INTEGER PRIMARY KEY, 'data' BLOB, 'x' INTEGER, 'y' INTEGER, 'z' INTEGER, 'importKey' INTEGER)
sqlite> select sql from sqlite_master where tbl_name = 'tiles_v2';
CREATE TABLE tiles_v2 (pk int primary key, x int, y int, z int, layer int, data blob, timestamp real)
You get a UNIQUE constraint failed error when the data you are inserting contains a value that already exists in a UNIQUE (or PRIMARY KEY) column of the table you are inserting into.
INSERT OR IGNORE ... will insert the rows and silently skip any row that would violate a constraint (other than a foreign key constraint).
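A minimal demonstration of that behaviour (throwaway in-memory database and a hypothetical table t):

#include <sqlite3.h>

/* Tiny demo: the second INSERT reuses primary key 1; with OR IGNORE it is
 * silently skipped instead of failing the batch with SQLITE_CONSTRAINT. */
static int or_ignore_demo(void)
{
    sqlite3 *db = NULL;
    int rc = sqlite3_open(":memory:", &db);
    if (rc == SQLITE_OK)
        rc = sqlite3_exec(db,
            "CREATE TABLE t (pk INTEGER PRIMARY KEY, v TEXT);"
            "INSERT INTO t VALUES (1, 'first');"
            "INSERT OR IGNORE INTO t VALUES (1, 'second');",  /* skipped, rc stays SQLITE_OK */
            NULL, NULL, NULL);
    sqlite3_close(db);
    return rc;
}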
If the message named a foreign key constraint, it would mean you are deleting a row from a parent table while a child table still references its primary key as a foreign key. To avoid that error, delete the child table records first and the parent table record afterwards.
SQLite also lets you define a CHECK constraint at the column level or the table level. Whenever a row is inserted into the table or an existing row is updated, the expression attached to each CHECK constraint is evaluated; if it evaluates to 0 (false), the statement fails with a constraint error, while NULL or any non-zero result passes.
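A small sketch of that behaviour (hypothetical table; the INSERT fails because the CHECK expression evaluates to 0):

#include <sqlite3.h>

/* Tiny demo: the INSERT violates the CHECK expression (z >= 0 evaluates to 0),
 * so sqlite3_exec returns SQLITE_CONSTRAINT. */
static int check_demo(void)
{
    sqlite3 *db = NULL;
    int rc = sqlite3_open(":memory:", &db);
    if (rc == SQLITE_OK)
        rc = sqlite3_exec(db, "CREATE TABLE t (z INTEGER CHECK (z >= 0));",
                          NULL, NULL, NULL);
    if (rc == SQLITE_OK)
        rc = sqlite3_exec(db, "INSERT INTO t VALUES (-1);",   /* -> SQLITE_CONSTRAINT */
                          NULL, NULL, NULL);
    sqlite3_close(db);
    return rc;
}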
It probably means your insert statement is violating a constraint in the new table. It could be a primary key constraint, a unique constraint, a foreign key constraint (if you're running with PRAGMA foreign_keys = ON;), and so on.
You fix that either by dropping the constraint, correcting the data, or dropping the data. Dropping the constraint is usually a Bad Thing, but that depends on the application.
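If all you see is the generic 'constraint failed' message, the connection itself can tell you which kind of constraint fired. A sketch, assuming SQLite 3.7.16 or later for the named SQLITE_CONSTRAINT_* extended codes:

#include <stdio.h>
#include <sqlite3.h>

/* Sketch: call right after the failing sqlite3_step() to see which
 * constraint was violated; older SQLite versions only report the
 * generic SQLITE_CONSTRAINT code, but the message is still useful. */
static void report_constraint(sqlite3 *db)
{
    int xrc = sqlite3_extended_errcode(db);
    const char *msg = sqlite3_errmsg(db);  /* e.g. "UNIQUE constraint failed: tiles_v2.pk" */

    switch (xrc) {
    case SQLITE_CONSTRAINT_PRIMARYKEY:
    case SQLITE_CONSTRAINT_UNIQUE:
        fprintf(stderr, "duplicate key: %s\n", msg);
        break;
    case SQLITE_CONSTRAINT_FOREIGNKEY:
        fprintf(stderr, "foreign key violation: %s\n", msg);
        break;
    case SQLITE_CONSTRAINT_NOTNULL:
        fprintf(stderr, "NOT NULL violation: %s\n", msg);
        break;
    default:
        fprintf(stderr, "constraint failed (%d): %s\n", xrc, msg);
        break;
    }
}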
Is there a compelling reason to copy data one row at a time instead of as a set?
INSERT INTO new_table
SELECT column_list FROM old_table;
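If you want to run that from the same C migration code, one sqlite3_exec call is enough. A sketch; old_table, new_table, and the column list are placeholders:

#include <stdio.h>
#include <sqlite3.h>

/* Sketch: set-based copy in one statement instead of a row-by-row loop.
 * Table and column names are placeholders. */
static int copy_as_set(sqlite3 *db)
{
    char *err = NULL;
    int rc = sqlite3_exec(db,
        "INSERT INTO new_table (col_a, col_b) "
        "SELECT col_a, col_b FROM old_table;",
        NULL, NULL, &err);
    if (rc != SQLITE_OK && err != NULL) {
        fprintf(stderr, "copy failed: %s\n", err);  /* names the failing constraint */
        sqlite3_free(err);
    }
    return rc;
}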
If you need help identifying the constraint, edit your original question, and post the output of these two SQLite queries.
select sql from sqlite_master where tbl_name = 'old_table_name';
select sql from sqlite_master where tbl_name = 'new_table_name';
Update: Based on the output of those two queries, I see only one constraint--the primary key constraint in each table. If you haven't built any triggers on these tables, the only constraint that can fail is the primary key constraint. And the only way that constraint can fail is if you try to insert two rows that have the same value for 'pk'.
I suppose that could happen in a few different ways.
You can determine whether there are duplicate values of 'pk' in the old table by running this query.
select pk
from old_table_name
group by pk
having count(*) > 1;
You might consider trying to manually migrate the data using INSERT INTO . . . SELECT . . .
If that fails, add a WHERE clause to reduce the size of the set until you isolate the bad data.
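For example, here is a sketch that copies only one pk range per call, so you can halve whichever range fails until the offending rows are isolated. The importKey -> layer mapping is a guess and timestamp is left NULL:

#include <sqlite3.h>

/* Sketch: copy only the rows whose pk falls in [lo, hi]; bisect the range
 * that fails until the bad row(s) are found. Column mapping is assumed. */
static int copy_tile_range(sqlite3 *db, sqlite3_int64 lo, sqlite3_int64 hi)
{
    sqlite3_stmt *st = NULL;
    int rc = sqlite3_prepare_v2(db,
        "INSERT INTO tiles_v2 (pk, x, y, z, layer, data) "
        "SELECT pk, x, y, z, importKey, data FROM tiles "
        "WHERE pk BETWEEN ?1 AND ?2;", -1, &st, NULL);
    if (rc == SQLITE_OK) {
        sqlite3_bind_int64(st, 1, lo);
        sqlite3_bind_int64(st, 2, hi);
        rc = sqlite3_step(st);
        if (rc == SQLITE_DONE)
            rc = SQLITE_OK;
    }
    sqlite3_finalize(st);
    return rc;
}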