I have a .sql file containing thousands of individual insert statements, and running them all takes forever. I am trying to figure out a way to do this more efficiently. In Python, the sqlite3 library can't do things like ".read" or ".import", and executescript is too slow for that many inserts.
I installed the sqlite3.exe shell in hopes of using ".read" or ".import", but I can't quite figure out how to use it. Running it through Django in Eclipse doesn't work because it expects the database to be at the root of my C drive, which seems silly. Running it through the command line doesn't work because it can't find my database file (unless I'm doing something wrong).
Any tips?
Thanks!
Are you using transactions? SQLite creates a separate transaction for every insert statement by default, which slows things way down.
By default, the sqlite3 module opens transactions implicitly before a Data Modification Language (DML) statement (i.e. INSERT/UPDATE/DELETE/REPLACE).
If you instead open one transaction at the start and commit it at the end, it will speed things up a lot.
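For example, here is a minimal sketch of that approach in Python. It assumes the statements live in a file called inserts.sql and the database is mydb.sqlite3 (both names are made up), that every statement ends with a semicolon, and that the file contains no BEGIN/COMMIT of its own:

```python
import sqlite3

conn = sqlite3.connect("mydb.sqlite3")  # hypothetical database path

with open("inserts.sql", "r", encoding="utf-8") as f:
    script = f.read()  # hypothetical file of plain INSERT statements

# executescript performs no transaction control of its own, so wrapping
# the whole file in one BEGIN/COMMIT runs every INSERT in a single
# transaction instead of one transaction per statement.
conn.executescript("BEGIN;\n" + script + "\nCOMMIT;")
conn.close()
```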
Did you try running the inserts inside a single transaction? If not, each insert is treated as its own transaction and... well, you can read the SQLite FAQ on this here.
In addition to running the queries in bulk inside a single transaction, also try VACUUMing and ANALYZEing the database file. It helped with a similar problem of mine.
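If you want to do that from Python rather than from the shell, a minimal sketch (reusing the hypothetical mydb.sqlite3 file from above) could look like this; note that VACUUM can't run inside an open transaction, so do it after the bulk insert has been committed:

```python
import sqlite3

conn = sqlite3.connect("mydb.sqlite3")  # hypothetical database path
conn.execute("VACUUM")    # rebuild the file and reclaim free pages
conn.execute("ANALYZE")   # gather statistics for the query planner
conn.close()
```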
Use a parameterized query, and use a transaction.
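A rough sketch of combining the two, assuming you can get the data into Python as tuples instead of pre-built INSERT statements (the table, columns, and rows here are made up):

```python
import sqlite3

conn = sqlite3.connect("mydb.sqlite3")  # hypothetical database path
conn.execute("CREATE TABLE IF NOT EXISTS items (name TEXT, qty INTEGER)")

rows = [("widget", 3), ("gadget", 7), ("gizmo", 1)]  # example data

# One parameterized statement run once per row; all the inserts happen
# inside a single transaction that the final commit() closes.
conn.executemany("INSERT INTO items (name, qty) VALUES (?, ?)", rows)
conn.commit()
conn.close()
```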