Python and sqlite3 - adding thousands of rows

I have a .sql file containing thousands of individual INSERT statements, and running them all takes forever. I am trying to figure out a way to do this more efficiently. In Python, the sqlite3 library can't do things like ".read" or ".import", and executescript is too slow for that many inserts.

I installed the sqlite3.exe shell in hopes of using ".read" or ".import", but I can't quite figure out how to use it. Running it through Django in Eclipse doesn't work because it expects the database to be at the root of my C drive, which seems silly. Running it through the command line doesn't work because it can't find my database file (unless I'm doing something wrong).
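For reference, the shell takes the database file's path as its first argument, and .read paths are resolved relative to the directory you launched the shell from, so absolute paths are the safe bet. A sketch with hypothetical Windows paths:

    C:\> sqlite3.exe C:\projects\mysite\db.sqlite3
    sqlite> .read C:\projects\mysite\inserts.sql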

Any tips?

Thanks!

JPC, asked Jan 18 '11


4 Answers

Are you using transactions? By default SQLite creates a separate transaction for every insert statement, which slows things way down.

From the sqlite3 module documentation: "By default, the sqlite3 module opens transactions implicitly before a Data Modification Language (DML) statement (i.e. INSERT/UPDATE/DELETE/REPLACE)."

If you instead open a single transaction at the start and commit it once at the end, it will speed things up a lot.
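A minimal sketch of that, assuming the dump file holds one complete statement per line (the file name, table, and database path here are hypothetical):

    import sqlite3

    conn = sqlite3.connect("app.db")    # hypothetical database path
    cur = conn.cursor()
    with open("inserts.sql") as f:      # hypothetical dump file
        for statement in f:
            cur.execute(statement)      # every INSERT joins the same implicit transaction
    conn.commit()                       # one commit, and one disk sync, for the whole batch
    conn.close()

The first INSERT opens the implicit transaction the docs describe, and nothing is written durably until the single commit() at the end.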

Michael Low, answered Oct 15 '22


Did you try running the inserts inside a single transaction? If not, each insert is treated as its own transaction, and the SQLite FAQ explains why that makes INSERTs slow.
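If you would rather keep using executescript, one sketch is to add the transaction control to the script yourself; the sqlite3 module passes the script through as-is, so the wrapper takes effect. This assumes the dump contains only bare INSERT statements with no BEGIN/COMMIT of its own, and the paths are hypothetical:

    import sqlite3

    conn = sqlite3.connect("app.db")    # hypothetical database path
    with open("inserts.sql") as f:      # hypothetical dump file
        script = f.read()
    # One explicit transaction around the whole script, instead of an
    # implicit transaction (and a disk sync) per INSERT.
    conn.executescript("BEGIN TRANSACTION;\n" + script + "\nCOMMIT;")
    conn.close()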

Spaceghost, answered Oct 15 '22


In addition to running the inserts in bulk inside a single transaction, also try running VACUUM and ANALYZE on the database file. It helped with a similar problem of mine.
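Both are plain statements you can issue from Python; a sketch, with a hypothetical database path:

    import sqlite3

    conn = sqlite3.connect("app.db")   # hypothetical database path
    conn.execute("VACUUM")             # rebuild the file and reclaim free pages
    conn.execute("ANALYZE")            # refresh the statistics the query planner uses
    conn.close()

Note that VACUUM cannot run inside a transaction; it works here because the sqlite3 module only opens implicit transactions before DML statements.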

zaadeh, answered Oct 15 '22


Use a parameterized query, and use a transaction.
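A sketch combining the two, assuming you can get at the data as Python tuples instead of pre-rendered SQL (the table, columns, and database path are hypothetical):

    import sqlite3

    rows = [("alice", 1), ("bob", 2), ("carol", 3)]   # hypothetical data

    conn = sqlite3.connect("app.db")                  # hypothetical database path
    with conn:  # one transaction: commits on success, rolls back on error
        conn.executemany(
            "INSERT INTO users (name, score) VALUES (?, ?)",  # hypothetical table
            rows,
        )
    conn.close()

With executemany, SQLite compiles the statement once and reuses it for every row, on top of the single-transaction savings.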

CashCow, answered Oct 15 '22