With the example below, on my machine, range(150) triggers the error while range(100) does not:
from peewee import *

database = SqliteDatabase(None)

class Base(Model):
    class Meta:
        database = database

colnames = ["A", "B", "C", "D", "E", "F", "G", "H"]
cols = {x: TextField() for x in colnames}
table = type('mytable', (Base,), cols)

database.init('test.db')
database.create_tables([table])

data = []
for x in range(150):
    data.append({x: 1 for x in colnames})

with database.atomic() as txn:
    table.insert_many(data).upsert().execute()
Leads to:
Traceback (most recent call last):
  File "<stdin>", line 2, in <module>
  File "/cluster/home/ifiddes/python2.7/lib/python2.7/site-packages/peewee.py", line 3213, in execute
    cursor = self._execute()
  File "/cluster/home/ifiddes/python2.7/lib/python2.7/site-packages/peewee.py", line 2628, in _execute
    return self.database.execute_sql(sql, params, self.require_commit)
  File "/cluster/home/ifiddes/python2.7/lib/python2.7/site-packages/peewee.py", line 3461, in execute_sql
    self.commit()
  File "/cluster/home/ifiddes/python2.7/lib/python2.7/site-packages/peewee.py", line 3285, in __exit__
    reraise(new_type, new_type(*exc_args), traceback)
  File "/cluster/home/ifiddes/python2.7/lib/python2.7/site-packages/peewee.py", line 3454, in execute_sql
    cursor.execute(sql, params or ())
peewee.OperationalError: too many SQL variables
This limit seems very low to me. I am trying to use peewee to replace existing pandas-based SQL construction, because pandas lacks support for a primary key. Only being able to insert ~100 records per query is very restrictive, and fragile if the number of columns grows some day.
How can I make this work better? Is it possible?
After some investigation, the problem appears to be related to the maximum number of parameters a SQL query can have: SQLITE_MAX_VARIABLE_NUMBER. Each inserted row binds one parameter per column, so with eight columns the compile-time default of 999 (used by SQLite versions before 3.32) allows 999 // 8 = 124 rows per INSERT, which is exactly why range(150) fails while range(100) succeeds.
To do big bulk inserts, I first estimate SQLITE_MAX_VARIABLE_NUMBER and then use it to chunk the list of dictionaries I want to insert.
To estimate the value I use this function, inspired by this answer:
def max_sql_variables():
    """Get the maximum number of arguments allowed in a query by the
    current sqlite3 implementation. Based on this question.

    Returns
    -------
    int
        inferred SQLITE_MAX_VARIABLE_NUMBER
    """
    import sqlite3
    db = sqlite3.connect(':memory:')
    cur = db.cursor()
    cur.execute('CREATE TABLE t (test)')
    low, high = 0, 100000
    # Binary search for the largest parameter count that executes
    # without raising "too many SQL variables".
    while (high - 1) > low:
        guess = (high + low) // 2
        query = 'INSERT INTO t VALUES ' + ','.join(['(?)' for _ in
                                                    range(guess)])
        args = [str(i) for i in range(guess)]
        try:
            cur.execute(query, args)
        except sqlite3.OperationalError as e:
            if "too many SQL variables" in str(e):
                high = guess
            else:
                raise
        else:
            low = guess
    cur.close()
    db.close()
    return low
SQLITE_MAX_VARIABLE_NUMBER = max_sql_variables()
Then I use that value to slice the data into chunks:

with database.atomic() as txn:
    # Subtract one in case peewee adds a variable of its own.
    size = (SQLITE_MAX_VARIABLE_NUMBER // len(data[0])) - 1
    for i in range(0, len(data), size):
        table.insert_many(data[i:i + size]).upsert().execute()
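As a side note: if you can move to peewee 3.x, the slicing can be delegated to the library's chunked() helper, and upsert() (removed in 3.x) is replaced by on_conflict_replace(). A minimal sketch, assuming peewee 3.x and the same database, table and data as above:

from peewee import chunked

# Sketch assuming peewee 3.x: chunked() yields successive fixed-size
# batches, and on_conflict_replace() is the 3.x replacement for
# the old upsert() call.
with database.atomic():
    # 100 rows * 8 columns = 800 bound variables, safely under the
    # 999 default limit of older SQLite builds.
    for batch in chunked(data, 100):
        table.insert_many(batch).on_conflict_replace().execute()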
An update on the execution speed of max_sql_variables.
On a three-year-old Intel machine with 4 cores and 4 GB of RAM, running openSUSE Tumbleweed, where SQLITE_MAX_VARIABLE_NUMBER is 999, the function runs in less than 100 ms. Setting high = 1000000 brings the execution time to around 300 ms.
On a newer Intel machine with 8 cores and 8 GB of RAM, running Kubuntu, where SQLITE_MAX_VARIABLE_NUMBER is 250000, the function runs in about 2.6 seconds and returns 99999 (the search is capped by high). Setting high = 1000000 brings the execution time to around 4.5 seconds.
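Finally, if you control the Python version, the probing can be skipped entirely: since Python 3.11 the standard sqlite3 module can report the limit directly. A minimal sketch:

import sqlite3

# Python 3.11+ exposes SQLite's runtime limits; this returns the
# connection's current SQLITE_LIMIT_VARIABLE_NUMBER without probing.
db = sqlite3.connect(':memory:')
print(db.getlimit(sqlite3.SQLITE_LIMIT_VARIABLE_NUMBER))
db.close()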