This Python code should run statements on the database, but the SQL statements are not executed:
from sqlalchemy import create_engine

# Read the whole SQL script from disk
with open("test.sql", "r") as sql_file:
    sql_query = sql_file.read()

engine = create_engine(
    'postgresql+psycopg2://user:password@localhost/test', echo=False)
conn = engine.connect()
print(sql_query)
result = conn.execute(sql_query)
conn.close()
The test.sql file contains SQL statements which create 89 tables.
The tables are not created when the file defines all 89 tables, but if I reduce the number of tables to 2, it works.
Is there a limit on the number of queries that can be executed within conn.execute()? How do I run any number of raw queries like this?
Note that SQLAlchemy's execute() returns a ResultProxy whose rows behave like tuples rather than plain dicts, although column values can still be accessed by name.
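A minimal sketch of that row behavior, assuming SQLAlchemy 1.x and a hypothetical users table with id and name columns (neither is part of the question above):

result = conn.execute("SELECT id, name FROM users")
for row in result:
    print(row[0])        # positional access, tuple-style
    print(row["name"])   # access by column name also works on RowProxy in 1.x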
The create_engine() function takes the connection URL and returns an Engine that references both a Dialect and a Pool, which together interpret the DBAPI module's functions and the behavior of the database.
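A small sketch of that relationship; the URL is the placeholder from the question and no connection is actually opened until connect() is called:

from sqlalchemy import create_engine

engine = create_engine('postgresql+psycopg2://user:password@localhost/test')
print(engine.dialect.name)   # the Dialect chosen from the URL, e.g. "postgresql"
print(engine.pool.status())  # the connection Pool the Engine manages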
Perhaps try forcing autocommit. The autocommit execution option is attached to the statement, so wrap the raw SQL in text():

conn.execute(text(RAW_SQL).execution_options(autocommit=True))
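Put together with the code from the question, a minimal sketch of the autocommit approach (SQLAlchemy 1.x; the autocommit execution option was removed in 2.0) looks like this:

from sqlalchemy import create_engine, text

engine = create_engine('postgresql+psycopg2://user:password@localhost/test')

with open("test.sql") as f:
    sql_query = f.read()

conn = engine.connect()
# The autocommit option commits the script's DDL once execute() finishes.
conn.execute(text(sql_query).execution_options(autocommit=True))
conn.close()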
Another approach is to use an explicit transaction and commit it yourself:

t = conn.begin()
try:
    conn.execute(RAW_SQL)
    t.commit()
except:
    t.rollback()
    raise
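The same idea can be written with the engine.begin() context manager, which commits when the block exits normally and rolls back if it raises. A minimal sketch, using the test.sql file from the question:

from sqlalchemy import create_engine

engine = create_engine('postgresql+psycopg2://user:password@localhost/test')

with open("test.sql") as f:
    raw_sql = f.read()

# Opens a connection, starts a transaction, commits on success,
# rolls back on exception.
with engine.begin() as conn:
    conn.execute(raw_sql)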
P.S.: You can also pass execution_options as a parameter to create_engine().
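A short sketch of that engine-wide variant (again assuming SQLAlchemy 1.x, since the autocommit option no longer exists in 2.0):

from sqlalchemy import create_engine

engine = create_engine(
    'postgresql+psycopg2://user:password@localhost/test',
    execution_options={"autocommit": True},
)
# Every connection checked out from this engine now commits after each
# executed statement.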