I am writing psql queries against Amazon Redshift, and I am trying to save the query output as a CSV file directly from a query in SQL Workbench.
The reason I want to do this through a query, instead of running a SELECT and right-clicking to save the output as CSV, is the amount of data involved: I found that writing the output into a temp table is much, much faster than displaying it with a SELECT. So I am hoping that saving straight to a local CSV can be faster too.
I have tried the top solution here, but it doesn't work on Amazon Redshift. When I run
Copy (SELECT col1, col2 FROM my_table) TO '[my local csv path]' WITH CSV DELIMITER ',';
or try \copy, it keeps showing me
Amazon Invalid operation: syntax error at or near "("
or
Amazon Invalid operation: syntax error at or near "\"
Then I checked the Amazon Redshift query tutorial and didn't find any clause that saves output to a local CSV. It seems that COPY loads data from an Amazon data source into Redshift, and UNLOAD saves data to S3, but I just want to save the data on my local machine.
So, is there any way to save the Redshift output to a local CSV from SQL Workbench?
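For reference, the S3 route mentioned above would look roughly like this (a sketch only; the bucket path and IAM role ARN are hypothetical placeholders, and the file still lands in S3 rather than on the local machine, so a separate download step is needed):

```sql
-- Sketch: UNLOAD writes query results to S3, not to a local path.
-- 's3://my-bucket/prefix/' and the IAM role ARN below are made-up placeholders.
UNLOAD ('select col1, col2 from my_table')
TO 's3://my-bucket/prefix/my_table_'
IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
DELIMITER ','
ADDQUOTES
PARALLEL OFF;
-- Afterwards, from a shell with the AWS CLI installed:
--   aws s3 cp s3://my-bucket/prefix/my_table_000 ./my_table.csv
```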
Try running either of the following in SQL Workbench/J:
WbExport -type=text
-file='C:\Downloads\myData.txt'
-delimiter='\t'
-decimal=','
-dateFormat='yyyy-MM-dd';
select a, b, c from myTable;
WbExport -type=text
-file='C:\Downloads\myQuery.txt'
-delimiter='\t'
-header=true
-tableWhere="WHERE a is not null and date between 11/11/11 and 22/22/22"
-sourcetable=mytable;
Yes there is; try this out:
PGPASSWORD=<password> psql -h <host> -d <db> -U <user> -p 5439 -A -c "select * from <table>" -F '<delimiter>' -o temp.csv
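For readability, the same one-liner can be assembled from variables first and reviewed before running. The host, database, user, and table below are hypothetical placeholders. `-A` turns off psql's aligned (padded) output, `-t` additionally drops the header row and row-count footer, and `-F ','` sets the field separator, so the output file comes out as plain CSV:

```shell
# All connection values below are hypothetical placeholders.
PGHOST="example-cluster.abc123.us-east-1.redshift.amazonaws.com"
PGPORT=5439
PGDATABASE="mydb"
PGUSER="myuser"
OUTFILE="/tmp/my_table.csv"

# -A: unaligned output; -t: tuples only (no header/footer); -F ',': comma separator.
CMD="psql -h $PGHOST -p $PGPORT -d $PGDATABASE -U $PGUSER -A -t -F ',' -o $OUTFILE -c \"select * from my_table\""

# Print the command so it can be checked before executing it.
echo "$CMD"
```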
I know your question is about SQL Workbench, but if you are willing to use the command line on Linux, this may be a solution; it's working nicely for us:
#!/bin/zsh
# We are assuming you are not appending to each file and you don't need a header.
out_put='/tmp/output.csv'
# Remove any previous output file; rm -f on a single file is safer than rm -rf.
rm -f "$out_put"
PGPASSWORD='YOUR_PASSWORD' psql -h YOUR_STUFF-cluster-1.YOUR_STUFF.us-east-1.redshift.amazonaws.com -p YOUR_PORT_NUMBER -d YOUR_DATABASE -U YOUR_USER_NAME -A -t -c "select * from SOME_TABLE limit 10" -F ',' -o "$out_put"
echo "your file is ready:" "$out_put"