Tips for optimizing an sqlite database with over a gig of data in it? [closed]

I am working with a larger-than-average SQLite database (for use on both Windows and Linux) and am looking to maximize the performance I get out of it. The database will be installed on commodity hardware along with an SQLite GUI. The users I am delivering this to are SQL savvy, but they are unlikely to undertake their own optimizations (creating indexes, setting PRAGMAs, etc.), so I am keen to get as much out-of-the-box performance as possible (to ensure maximum usage of the data).

One issue is that Windows seems to throttle query execution much more than Linux; another is that I am less familiar with SQLite's approach to indexing (compared to other databases such as Postgres).
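To illustrate the kind of out-of-the-box tuning I mean, here is a rough sketch (using Python's sqlite3 module; the table and column names are made up) of what I could bake in when the database file is built. Page size, indexes, and ANALYZE statistics travel with the file, while per-connection pragmas do not:

    import sqlite3

    # Build-time tuning that ships inside the .db file itself
    # (table/column names here are invented for illustration).
    con = sqlite3.connect("shipped.db")
    con.execute("PRAGMA page_size = 4096")   # must be set before any tables exist
    con.execute("CREATE TABLE readings (station_id INTEGER, taken_at TEXT, value REAL)")
    # ... bulk import of the data would happen here ...
    con.execute("CREATE INDEX idx_readings_station ON readings (station_id, taken_at)")
    con.execute("ANALYZE")                   # store planner statistics in the file
    con.commit()
    con.close()

    # Per-connection settings (e.g. PRAGMA cache_size, PRAGMA temp_store = MEMORY)
    # don't persist in the file, so the GUI would have to issue them on each connect.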

asked Oct 22 '08 by wioota


2 Answers

Have you read the SQLite Optimization FAQ (a few years old, but still seems useful)?

I don't think 1 GB is particularly large, even for SQLite. It can certainly handle much larger databases stably.

answered by Dan Lenski


An update - we've had the most success with more intelligent ordering of data at import time, aggressive indexing (more than doubling the size of the db file in the process), tuning settings in Windows to get XP to behave more like a server than a desktop, changing the order of joins (you can't rely solely on the optimizer), and measuring our progress with a performance test suite (just a harness for running queries and taking measurements); a rough sketch of that harness is below.
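Roughly the shape of that harness, in Python's sqlite3 (the database path and queries below are placeholders, not our real schema):

    import sqlite3, time

    # Run each query a few times and keep the best timing, so you can tell whether
    # a new index or a reordered join actually helped. Queries are placeholders.
    QUERIES = {
        "stations_by_region": "SELECT * FROM stations WHERE region = 'X'",
        "readings_join": ("SELECT s.name, r.value FROM stations s "
                          "JOIN readings r ON r.station_id = s.id "
                          "WHERE r.taken_at > '2008-01-01'"),
    }

    def time_queries(db_path, runs=3):
        con = sqlite3.connect(db_path)
        results = {}
        for name, sql in QUERIES.items():
            timings = []
            for _ in range(runs):
                start = time.time()
                con.execute(sql).fetchall()   # force all rows to be read
                timings.append(time.time() - start)
            results[name] = min(timings)      # best-of-n smooths out cache noise
        con.close()
        return results

    for name, secs in time_queries("shipped.db").items():
        print("%s: %.3fs" % (name, secs))

Running EXPLAIN QUERY PLAN on the same queries is also handy for checking whether a reordered join actually changes the plan SQLite picks.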

The experience has been fun, and the end users are happy and able to work with the data.

answered by wioota