Database with a table containing 700 million records [duplicate]

Possible Duplicate:
What are the performance characteristics of sqlite with very large database files?

I want to create a .NET application that uses a database that will contain around 700 million records in one of its tables. I wonder whether SQLite's performance would satisfy this scenario, or whether I should use SQL Server instead. I like the portability that SQLite gives me.

asked Oct 28 '25 04:10 by Alireza Noori

2 Answers

Go for SQL Server, for sure; 700 million records is too much for SQLite.

With SQLite you have the following limitations:

  • Single-process writes: only one connection can write at a time (see the sketch below).
  • No mirroring.
  • No replication.

Check out this thread: What are the performance characteristics of sqlite with very large database files?
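
The single-writer behavior is easy to demonstrate. Below is a minimal sketch, assuming the Microsoft.Data.Sqlite NuGet package (System.Data.SQLite behaves similarly); the file name writer.db and the table are throwaway illustrations. While one connection holds an open write transaction, a second connection's write fails once its busy timeout expires:

    using System;
    using Microsoft.Data.Sqlite;

    // First connection creates a table and holds an open write transaction.
    using var writer = new SqliteConnection("Data Source=writer.db");
    writer.Open();
    using (var ddl = writer.CreateCommand())
    {
        ddl.CommandText = "CREATE TABLE IF NOT EXISTS t (v INTEGER)";
        ddl.ExecuteNonQuery();
    }

    using var tx = writer.BeginTransaction();
    using (var insert = writer.CreateCommand())
    {
        insert.Transaction = tx;
        insert.CommandText = "INSERT INTO t VALUES (1)";
        insert.ExecuteNonQuery(); // acquires the database's single write lock
    }

    // A second connection now cannot write until the first commits.
    using var other = new SqliteConnection("Data Source=writer.db");
    other.Open();
    using var blocked = other.CreateCommand();
    blocked.CommandText = "INSERT INTO t VALUES (2)";
    blocked.CommandTimeout = 1; // fail after ~1 s instead of the 30 s default
    try
    {
        blocked.ExecuteNonQuery();
    }
    catch (SqliteException ex)
    {
        Console.WriteLine($"Second writer blocked: {ex.Message}"); // SQLITE_BUSY
    }

    tx.Commit(); // releasing the lock lets other writers proceed

Note that reads can still proceed concurrently (especially in WAL mode); it is specifically concurrent writers that serialize.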

answered Oct 29 '25 18:10 by Habib


700 million is a lot.

To give you an idea: say your record size was 4 bytes (essentially storing a single value); then your DB is going to be over 2 GB. If your record size is closer to 100 bytes, then it's closer to 65 GB (and that's not including space used by indexes, transaction log files, etc.).
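
The arithmetic behind those figures, as a quick sketch (payload bytes only; indexes, page overhead, and logs come on top):

    using System;

    // Back-of-envelope table size for 700 million rows, payload only.
    const long rows = 700_000_000;
    const double GiB = 1024.0 * 1024 * 1024;
    Console.WriteLine($"  4-byte rows: {rows * 4L / GiB:F1} GiB");   // ~2.6 GiB
    Console.WriteLine($"100-byte rows: {rows * 100L / GiB:F1} GiB"); // ~65.2 GiB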

We do a lot of work with large databases and I'd never consider SQLite for anything of that size. Quite frankly, portability is the least of your concerns here. To query a DB of that size with any sort of responsiveness you will need an appropriately sized database server; I'd start with 32 GB of RAM and fast drives.

If it's 90%+ write-heavy, you might get away with less RAM. If it's read-heavy, you'll want to build the machine out so it can hold as much of the DB (or at least the indexes) in RAM as possible; otherwise you'll be at the mercy of disk spindle speeds.
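
For what it's worth, if you did stay on SQLite, the "keep hot data in RAM" idea maps to its page-cache and memory-mapped I/O pragmas. A hypothetical sketch, again assuming Microsoft.Data.Sqlite; the file name and the specific values are illustrative, not benchmarked:

    using Microsoft.Data.Sqlite;

    using var conn = new SqliteConnection("Data Source=big.db");
    conn.Open();
    using var cmd = conn.CreateCommand();
    // cache_size: negative values are in KiB, so -2000000 is roughly a 2 GB
    // page cache. mmap_size is in bytes (here 1 GB); SQLite silently caps it
    // at a compile-time maximum, so large values may be clamped.
    cmd.CommandText = @"
        PRAGMA cache_size = -2000000;
        PRAGMA mmap_size  = 1073741824;";
    cmd.ExecuteNonQuery();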

answered Oct 29 '25 19:10 by NotMe