 

python sqlite - database loaded into memory (RAM)? [closed]

Tags:

python

sqlite

ram

I am thinking about using the sqlite3 library in Python to do some data storage. It's important to me to avoid loading too much data into memory - there are potentially many (10+) gigabytes of data, and I want to be able to access it in such a way that the data isn't loaded into RAM all at once. Will sqlite3 accomplish this for me? I am thinking of the following code specifically:

import sqlite3

conn = sqlite3.connect('example.db')
c = conn.cursor()
c.execute('''SELECT * FROM table1''')
var = c.fetchall()

Suppose that example.db takes up 14 gb, and table1 takes up 1 gb. How much data would be loaded into RAM?

asked Feb 12 '26 by SheerSt

1 Answer

You almost certainly don't want to use fetchall(): it will load your entire table into RAM, plus whatever housekeeping the Python objects require, so “more than 1 GB” is the answer to your question. Use fetchone() instead.

answered Feb 14 '26 by Emmet
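A minimal sketch of the row-at-a-time approach the answer suggests, assuming the example.db file and table1 from the question already exist; handle_row() here is just a hypothetical placeholder for whatever per-row processing you actually need:

import sqlite3

def handle_row(row):
    # placeholder for your own per-row logic
    pass

conn = sqlite3.connect('example.db')
c = conn.cursor()

# fetchone() pulls back one row per call, so only a single row's worth of
# data lives in Python at a time; SQLite keeps the table on disk and reads
# pages into its cache as needed.
c.execute('SELECT * FROM table1')
row = c.fetchone()
while row is not None:
    handle_row(row)
    row = c.fetchone()

# Equivalently, the cursor is an iterator and yields rows on demand,
# rather than materializing the whole result set the way fetchall() does.
for row in c.execute('SELECT * FROM table1'):
    handle_row(row)

conn.close()

Either form keeps memory usage roughly proportional to one row rather than the whole 1 GB table; fetchmany(size) is the standard DB-API middle ground if you'd rather process rows in batches.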


