I am thinking about using the sqlite3 library in Python for some data storage. It's important to me to avoid loading too much data into memory - there are potentially many (10+) gigabytes of data, and I want to be able to access it in such a way that the data isn't loaded into RAM all at once. Will sqlite3 accomplish this for me? I am thinking of the following code specifically:
import sqlite3
conn = sqlite3.connect('example.db')
c = conn.cursor()
c.execute('''SELECT * FROM table1''')
var = c.fetchall()
Suppose that example.db takes up 14 gb, and table1 takes up 1 gb. How much data would be loaded into RAM?
You almost certainly don't want to use fetchall(): it materializes the entire result set as a list of Python tuples, so all of table1 ends up in RAM, plus the overhead of the Python objects themselves - "more than 1 GB" is the answer to your question. Note that merely calling connect() or execute() does not load the data; SQLite reads pages from disk on demand as rows are fetched. Use fetchone() (or just iterate over the cursor) instead, so only one row is held in memory at a time.
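A minimal sketch of the row-at-a-time approach. It uses an in-memory database with a few dummy rows so the example is self-contained; substitute your own `example.db` path and `table1` in practice:

```python
import sqlite3

# Self-contained setup: an in-memory DB standing in for your example.db.
conn = sqlite3.connect(':memory:')
c = conn.cursor()
c.execute('CREATE TABLE table1 (id INTEGER, value TEXT)')
c.executemany('INSERT INTO table1 VALUES (?, ?)',
              [(i, f'row{i}') for i in range(5)])
conn.commit()

# fetchone() pulls a single row per call, so only one row (plus SQLite's
# internal page cache) sits in Python memory at any moment.
c.execute('SELECT * FROM table1')
count = 0
row = c.fetchone()
while row is not None:
    count += 1          # stand-in for your real per-row processing
    row = c.fetchone()

# Equivalent and more idiomatic: iterate over the cursor directly,
# which also fetches rows incrementally rather than all at once.
rows_seen = sum(1 for _ in c.execute('SELECT * FROM table1'))

conn.close()
```

If per-row Python overhead matters, fetchmany(size) is a middle ground: it returns rows in fixed-size batches, so memory use stays bounded by the batch size rather than the table size.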