I am using SQLAlchemy to connect to different databases in Python, but without ORM support, as that cannot be used for several reasons.
Mainly, I build a complex SQL query using statements like
sql += "AND fieldname = '%s'" % myvar
In my case this is not an SQL injection problem, as the data always comes from a trusted source, but even trusted data can contain characters that break the query, such as ', %, or _.
In short, I need to escape them, and I wonder if there is an existing escape function that I could reuse.
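For context, a minimal sketch of the pattern described above, with hypothetical table and column names; a value containing a single quote breaks the concatenated statement:

sql = "SELECT * FROM books WHERE 1=1 "
myvar = "O'Reilly"  # trusted data, but the embedded quote is a problem
sql += "AND publisher = '%s'" % myvar
# -> SELECT * FROM books WHERE 1=1 AND publisher = 'O'Reilly'
#    the unescaped quote terminates the string literal early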
In cases where one must explicitly escape a string, and the standard tools don't align with the requirement, you can ask SQLAlchemy to escape using an engine's dialect:
import sqlalchemy
engine = sqlalchemy.create_engine(...)
# the dialect's literal processor returns the value as a quoted, escaped SQL literal
sqlalchemy.String('').literal_processor(dialect=engine.dialect)(value="untrusted value")
In my case, I needed to dynamically create a database according to user input (sqlalchemy-utils has this functionality, but it failed in my case).
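For the more general case in the question (splicing a value into a hand-built query), here is a short sketch; the table, column, and connection URL are hypothetical, and literal_processor returns the value already wrapped in quotes and escaped for the dialect:

import sqlalchemy

engine = sqlalchemy.create_engine("postgresql://localhost/test")  # placeholder URL
escape = sqlalchemy.String('').literal_processor(dialect=engine.dialect)

myvar = "O'Reilly"                      # trusted, but contains a quote
sql = "SELECT * FROM books WHERE 1=1 "
sql += "AND publisher = %s" % escape(myvar)
# -> SELECT * FROM books WHERE 1=1 AND publisher = 'O''Reilly'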
To extend @edd's answer, which works in a limited capacity.
@edd provided:
import sqlalchemy
engine = sqlalchemy.create_engine(...)
sqlalchemy.String('').literal_processor(dialect=engine.dialect)(value="untrusted value")
If your "untrusted value" is a query you want to execute, this will end up a double-quoted string wrapping a single-quoted string, which you can't directly execute without stripping the quotes, i.e. "'SELECT ...'"
.
You can used sqlalchemy.Integer().literal_processor
to do the same thing, but the result will not have the extra inner quotes, because it is intended to create an integer like 5
instead of a string like '5'
. So your result will only be quoted once: "SELECT ..."
.
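A minimal sketch of the difference, assuming a PostgreSQL engine (the placeholder URL is hypothetical):

import sqlalchemy

engine = sqlalchemy.create_engine("postgresql://localhost/test")  # placeholder URL

query = "SELECT 1"
as_string = sqlalchemy.String('').literal_processor(dialect=engine.dialect)(query)
as_integer = sqlalchemy.Integer().literal_processor(dialect=engine.dialect)(query)

print(as_string)   # 'SELECT 1'  (wrapped in single quotes)
print(as_integer)  # SELECT 1    (passed through without extra quoting)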
I found this Integer approach a little sketchy: will the person reading my code know why I'm doing this? For psycopg2 at least, there is a more direct and clearer approach.
If your underlying driver is psycopg2, you can use SQLAlchemy to reach down into the driver, get the cursor, and then use psycopg2's cursor.mogrify to bind and escape your query:
from sqlalchemy.orm import sessionmaker

Session = sessionmaker(bind=engine)  # engine from sqlalchemy.create_engine(...)
session = Session()
# reach through SQLAlchemy to the raw psycopg2 cursor
cursor = session.connection().connection.cursor()
processed_query = cursor.mogrify([mogrify args, see docs]).decode("UTF-8")
I got how to grab the cursor from this answer: SQLAlchemy, Psycopg2 and Postgresql COPY
And mogrify from this answer: psycopg2 equivalent of mysqldb.escape_string?
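For example, a sketch with hypothetical table and parameter values (mogrify accepts the same arguments as cursor.execute and returns the bound, escaped statement as bytes):

cursor = session.connection().connection.cursor()
processed_query = cursor.mogrify(
    "SELECT * FROM books WHERE publisher = %s AND price > %s",
    ("O'Reilly", 20),
).decode("UTF-8")
# -> SELECT * FROM books WHERE publisher = 'O''Reilly' AND price > 20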
My use case was building a query, then wrapping it in parentheses and aliasing it like (SELECT ...) AS temp_some_table, in order to pass it to PySpark's JDBC read. When SQLAlchemy builds the queries, it minimizes the parentheses, so I could only get SELECT ... AS temp_some_table. I used the above approach to get what I need:
cursor = session.connection().connection.cursor()
# mogrify with no parameters just renders the statement and returns it as bytes
aliased_query = cursor.mogrify(
    f"({query}) AS temp_{model.__tablename__}"
).decode("UTF-8")
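The aliased subquery can then be handed to the JDBC reader, roughly like this (the JDBC URL and connection properties are placeholders):

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()
df = spark.read.jdbc(
    url="jdbc:postgresql://localhost/test",  # placeholder JDBC URL
    table=aliased_query,                     # "(SELECT ...) AS temp_<tablename>"
    properties={"user": "...", "password": "...", "driver": "org.postgresql.Driver"},
)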