I'm running a query in which I generate a table variable with inserts, which means the query can become very large when there are many inserts. When I execute this query via a SQLAlchemy session, it runs as intended as long as the number of inserts (or, I suspect, the length of the query) is small enough. When it is too large, nothing happens. Does this make sense? Is there a configurable query-length limit I can set, or do I have to split my query up into pieces?
Thanks, PiR
You have probably hit your DBMS's maximum allowed packet size. If you are using MySQL on the backend, I suggest setting max_allowed_packet
to a higher value, for example 512000000 (roughly 512 MB). If you use another DBMS, look for its equivalent setting.
For MySQL I use:
engine.execute('SET GLOBAL max_allowed_packet=512000000;')
to set this value globally until the server restarts. (Note that engine.execute() is legacy API; in SQLAlchemy 1.4+ use a connection from engine.connect() and pass the statement wrapped in text().) To make the change permanent, add the setting to your my.ini/my.cnf, if you have enough permissions to do so.
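For reference, a persistent entry in my.ini/my.cnf would look something like the fragment below (the [mysqld] section and the "M" size suffix follow standard MySQL configuration conventions; pick a value appropriate for your workload):

```ini
[mysqld]
max_allowed_packet=512M
```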
I am not aware of a method to do this in a dbms-agnostic manner from SQLAlchemy.
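That said, the asker's own idea of splitting the query into pieces is the most portable workaround. Below is a minimal sketch that groups plain INSERT statement strings into scripts that each stay under a size budget; the session, table, and statement names are hypothetical, and depending on your driver you may need to execute each statement separately rather than as a multi-statement script:

```python
def split_inserts(statements, max_chars=4_000_000):
    """Group INSERT statements into scripts that each stay under a size budget.

    max_chars should sit comfortably below the server's max_allowed_packet.
    The accounting is approximate (joining separators are not counted).
    """
    batch, size = [], 0
    for stmt in statements:
        if batch and size + len(stmt) > max_chars:
            yield ";\n".join(batch) + ";"
            batch, size = [], 0
        batch.append(stmt)
        size += len(stmt)
    if batch:
        yield ";\n".join(batch) + ";"

# Hypothetical usage with a SQLAlchemy session (names are illustrative):
#   for script in split_inserts(insert_statements, max_chars=4_000_000):
#       session.execute(text(script))
#   session.commit()
```

Each yielded script is small enough to fit in a single packet, so the whole batch gets through even when the combined query would not.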