
Will SQLite performance degrade if the database size is greater than 2 gigabytes?

Last year, when I checked the SQLite website, the recommended maximum database size was 2 gigabytes. But now I cannot find that recommendation anymore.

So, has anyone tried working with an SQLite database bigger than 2 gigabytes using the latest version? How well did SQLite perform?

P.S.: I would like to make a mobile application that requires a big local database (for example, for storing Wikipedia articles).

Enkhbat asked Jan 22 '13

People also ask

Is SQLite good for large scale?

Very large datasets: an SQLite database is limited in size to 281 terabytes (2^48 bytes, 256 tebibytes). And even if it could handle larger databases, SQLite stores the entire database in a single disk file, and many filesystems limit the maximum size of files to something less than this.

Why is SQLite so slow?

The SQLite docs explain why this is slow: transaction speed is limited by disk drive speed because (by default) SQLite waits until the data really is safely stored on the disk surface before the transaction is complete. That way, if you suddenly lose power or your OS crashes, your data is still safe.
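
To make that concrete, here is a minimal Python sketch (the table name and data are made up) of the usual fix: batching many inserts into one transaction, so there is a single sync at commit instead of one per statement.

    import sqlite3

    conn = sqlite3.connect("articles.db")
    conn.execute("CREATE TABLE IF NOT EXISTS articles (title TEXT, body TEXT)")

    rows = [(f"Title {i}", "body text") for i in range(10_000)]

    # One transaction for the whole batch: a single sync at commit,
    # instead of waiting for the disk after every INSERT.
    with conn:  # commits on success, rolls back on error
        conn.executemany("INSERT INTO articles (title, body) VALUES (?, ?)", rows)

    conn.close()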

What is considered a large database?

A very large database (VLDB) is primarily an enterprise-class database. Although there is no specific size threshold for a VLDB, it can consist of billions of records and have a cumulative size of thousands of gigabytes, or even hundreds of terabytes.


2 Answers

There is no 2 GB limit.

SQLite database files have a maximum size of about 140 TB.

On a phone, the size of the storage (a few GB) will limit your database file size, while the amount of memory will limit how much data you can retrieve in one query. Furthermore, Android cursors have a 1 MB limit on results.
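
One way to stay under both limits is to fetch results in bounded chunks. A minimal Python sketch of keyset pagination (the table, column, and chunk size are made up; the same pattern applies to Android cursors):

    import sqlite3

    conn = sqlite3.connect("articles.db")
    conn.execute(
        "CREATE TABLE IF NOT EXISTS articles (id INTEGER PRIMARY KEY, title TEXT)"
    )

    # Keyset pagination: fetch bounded chunks so no single result set has to
    # fit in memory (or into a 1 MB cursor window on Android).
    last_id = 0
    while True:
        chunk = conn.execute(
            "SELECT id, title FROM articles WHERE id > ? ORDER BY id LIMIT 500",
            (last_id,),
        ).fetchall()
        if not chunk:
            break
        for _id, title in chunk:
            pass  # process each row here
        last_id = chunk[-1][0]

    conn.close()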


The database size will not, by itself, affect your performance. Your queries will be fast as long as they do not access more data than fits into the database's page cache (2 MB by default).
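
If your working set is bigger than that, the cache can be enlarged per connection with the standard cache_size pragma. A minimal sketch (the 64 MB figure is an arbitrary example, not a recommendation):

    import sqlite3

    conn = sqlite3.connect("articles.db")

    # A negative cache_size is interpreted as a size in KiB;
    # -65536 asks for roughly 64 MB of page cache for this connection.
    conn.execute("PRAGMA cache_size = -65536")

    print(conn.execute("PRAGMA cache_size").fetchone())  # sanity check
    conn.close()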

CL. answered Sep 20 '22


Usually, the larger the database, the more data you have in it. The more data you have, the longer searches may take. They don't have to, though; it depends on the search.

As for inserts, they may take longer if you have many indexes on a table. Updating an index takes time, so expect insert speed to degrade as the amount of indexed data grows.
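
If you control the load, one common mitigation is to drop secondary indexes before a bulk insert and rebuild them once afterwards. A minimal sketch, with a hypothetical table and index:

    import sqlite3

    conn = sqlite3.connect("articles.db")
    conn.execute("CREATE TABLE IF NOT EXISTS articles (title TEXT, body TEXT)")
    conn.execute("CREATE INDEX IF NOT EXISTS idx_articles_title ON articles(title)")

    rows = [(f"Title {i}", "body text") for i in range(100_000)]

    # Drop the secondary index, bulk-insert, then rebuild it in one pass;
    # maintaining the index row by row during the load is usually slower.
    conn.execute("DROP INDEX IF EXISTS idx_articles_title")
    with conn:
        conn.executemany("INSERT INTO articles (title, body) VALUES (?, ?)", rows)
    conn.execute("CREATE INDEX idx_articles_title ON articles(title)")
    conn.close()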

Updates may also be slower: matching rows must be found first (a search), then values have to be changed (which may trigger index updates).

I am telling you this from experience: if you expect a lot of data in your database, consider splitting it into multiple databases. This works well if your data is gathered daily and you can create a database for each day. It may make your search code more complex, but it will speed things up for searches limited to a date range.
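
A minimal sketch of that per-day split (the file naming and schema are made up); SQLite's ATTACH lets a limited search open only the day files it needs:

    import sqlite3
    from datetime import date, timedelta

    def db_path(day: date) -> str:
        # Hypothetical naming scheme: one database file per day.
        return f"events-{day.isoformat()}.db"

    today = date.today()
    conn = sqlite3.connect(db_path(today))
    conn.execute("CREATE TABLE IF NOT EXISTS events (ts TEXT, message TEXT)")
    with conn:
        conn.execute("INSERT INTO events VALUES (?, ?)",
                     ("2013-01-22T04:01", "hello"))

    # Make sure yesterday's file exists for the demo.
    yesterday = today - timedelta(days=1)
    tmp = sqlite3.connect(db_path(yesterday))
    tmp.execute("CREATE TABLE IF NOT EXISTS events (ts TEXT, message TEXT)")
    tmp.close()

    # A limited search attaches only the day files it needs.
    conn.execute("ATTACH DATABASE ? AS yday", (db_path(yesterday),))
    rows = conn.execute(
        "SELECT ts, message FROM events "
        "UNION ALL SELECT ts, message FROM yday.events"
    ).fetchall()
    print(rows)
    conn.close()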

Dariusz answered Sep 20 '22