 

How to escape single quotes in UNLOAD

    conn_string = "dbname='{}' port='{}' user='{}' password='{}' host='{}'"\
            .format(dbname,port,user,password,host_url) 

    sql="""UNLOAD ('select col1,col2 from %s.visitation_hourly_summary_us where col4= '2018-07-10' and col5= '1';') TO 's3://%s/%s/%s.csv' \
            credentials 'aws_access_key_id=%s;aws_secret_access_key=%s' \
            MANIFEST GZIP ALLOWOVERWRITE;Commit;""" \
            % (schema_name,s3_bucket_name, schema,table,aws_access_key_id,\
            aws_secret_access_key)

con = psycopg2.connect(conn_string)
cur = con.cursor()
cur.execute(sql)

I'm trying to execute the above script to read the table and then create a file in S3.

Since my columns are strings, I'm not able to escape the single quotes, and I'm getting a syntax error near where.

I've also tried escaping with \ in the WHERE condition, but it still shows the same error.

Any help would be highly appreciated.

Thanks

asked Sep 25 '18 by Mukesh Marimuthu

3 Answers

You can also use PostgreSQL-style dollar quoting:

unload 
($$
select * from table where id='ABC'
$$)
to 's3://bucket/queries_results/20150324/table_dump/'
credentials 'aws_access_key_id=;aws_secret_access_key='
;
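
Applied to the script in the question, a minimal sketch could look like the following. It assumes the same psycopg2 connection (con) and the same variable names (schema_name, s3_bucket_name, schema, table, aws_access_key_id, aws_secret_access_key) as the question; because the inner query sits between $$ delimiters, the single quotes around the date and the flag need no escaping:

# A sketch only; con and the variables below are assumed to come from the
# question's script.
sql = """UNLOAD ($$
    select col1, col2
    from {schema_name}.visitation_hourly_summary_us
    where col4 = '2018-07-10' and col5 = '1'
$$)
TO 's3://{bucket}/{schema}/{table}.csv'
credentials 'aws_access_key_id={key};aws_secret_access_key={secret}'
MANIFEST GZIP ALLOWOVERWRITE;""".format(
    schema_name=schema_name,
    bucket=s3_bucket_name,
    schema=schema,
    table=table,
    key=aws_access_key_id,
    secret=aws_secret_access_key,
)

cur = con.cursor()
cur.execute(sql)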
answered Oct 24 '22 by Fact


You would want to use two single quotes to enclose the value.

If your query contains quotes (for example to enclose literal values), put the literal between two sets of single quotation marks—you must also enclose the query between single quotation marks:

Example:

UNLOAD ('select * from venue where venuestate=''NV''')

Taken from the Redshift documentation: https://docs.aws.amazon.com/redshift/latest/dg/r_UNLOAD.html
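
Applied to the query in the question, a rough sketch with doubled quotes might look like this (it reuses the question's cursor and variable names, which are assumptions here):

# Sketch only; cur and the %-formatted variables are assumed from the question.
sql = """UNLOAD ('select col1, col2 from %s.visitation_hourly_summary_us
where col4 = ''2018-07-10'' and col5 = ''1''')
TO 's3://%s/%s/%s.csv'
credentials 'aws_access_key_id=%s;aws_secret_access_key=%s'
MANIFEST GZIP ALLOWOVERWRITE;""" % (
    schema_name, s3_bucket_name, schema, table,
    aws_access_key_id, aws_secret_access_key)

cur.execute(sql)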

answered Oct 24 '22 by krchun


As Sarang says, simply replacing the single quotes with double quotes in the col4 and col5 values of your query should do the trick.

However, I would suggest breaking your string down into smaller chunks that are easier to read and maintain. This way, you should be able to use execute as chepner suggests (and as the MySQL documentation describes):

# Create the inner SQL statement. Notice the single quotes for the general
# string and the double quotes for the col4 and col5 values
sql_stmt = ('SELECT col1, col2 '
            'FROM %s.visitation_hourly_summary_us '
            'WHERE col4 = "2018-07-10" AND col5= "1";' % schema_name)

# Format the s3 path
s3_target = 's3://%s/%s/%s.csv' % (s3_bucket_name, schema, table)

# Format credentials string
s3_credentials = 'aws_access_key_id=%s;aws_secret_access_key=%s' % (
    aws_access_key_id, aws_secret_access_key)

# Create a tuple with all preformatted strings
data = (sql_stmt, s3_target, s3_credentials)

# Format the UNLOAD statement skeleton. psycopg2 quotes and escapes each
# parameter itself, so the %s placeholders are left unquoted here
s3_stmt = ("UNLOAD (%s) TO %s "
           "CREDENTIALS %s "
           "MANIFEST GZIP ALLOWOVERWRITE;Commit;")

con = psycopg2.connect(conn_string)
cur = con.cursor()
cur.execute(s3_stmt, data)
answered Oct 24 '22 by bvidalar