I want to assign a case-sensitive alias to column names in a query, but PostgreSQL just ignores the case in the alias and folds it to lowercase. I know that it stores identifiers in lowercase, but I just want a case-sensitive alias in the query. Is there any way to get this working? Renaming keys manually after query execution for huge dict result sets sounds ridiculous.
Update: Important note - I'm using Amazon Redshift. Is it possible that Amazon restricts this?
Example
select superid from supertable;
...
cursor.execute(query)
results = cursor.fetchall()
for row in results:
    print row['superid']
# ^ WORKING
------------------------
select superid as "superId" from supertable;
...
cursor.execute(query)
results = cursor.fetchall()
for row in results:
    print row['superId']
# ^ NOT WORKING
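To make the failure mode concrete, here is a minimal sketch (plain Python, no database needed; the dict shape mimics what a dict-style cursor such as psycopg2's RealDictCursor returns). Redshift folds the alias to lowercase, so the mixed-case key is simply never present in the row:

```python
# Simulated fetched row: despite `as "superId"` in the SQL,
# Redshift returns the column keyed in lowercase.
row = {"superid": 42}

print(row["superid"])  # fine: the lowercase key exists

try:
    print(row["superId"])  # fails: the mixed-case key was folded away
except KeyError:
    print("no such key: superId")
```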
It is indeed a Redshift thing. Redshift is very finicky about mixed case. While a SELECT statement using a mixed-case alias will work fine through the JDBC connector, Redshift will ignore your attempts to use mixed case via the SQL internals and perhaps the API, which is what I think psycopg uses.
I ran into a problem where I couldn't load data using the COPY command because I was unable to create column names containing any uppercase characters. COPY works fine from one Redshift table to another, regardless of case, but COPY from S3 does not.
I tried double quotes around the column names, but Redshift ignored them, while my local Postgres instance worked just fine and honored the quoted mixed-case identifiers literally.
For reference, my question (and answer) are here.
Does case matter when 'auto' loading data from S3 into a Redshift table?
Hope that helps, even if it's not the answer you were hoping for.
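If the lowercase folding can't be avoided server-side, one fallback is the post-fetch rename the question hoped to avoid; done with one dict comprehension per row it is cheap even for large result sets. A sketch (pure Python; `remap_keys` and the sample rows are illustrative, not from the original post):

```python
def remap_keys(rows, mapping):
    """Return rows with keys renamed per `mapping`; unmapped keys pass through."""
    return [{mapping.get(k, k): v for k, v in row.items()} for row in rows]

# `rows` stands in for a dict-cursor fetchall() result from Redshift.
rows = [{"superid": 1}, {"superid": 2}]
renamed = remap_keys(rows, {"superid": "superId"})
# renamed == [{"superId": 1}, {"superId": 2}]
```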