I'm trying to dump the output of a query into a CSV file from an automated job and running into an issue with fields whose values contain my comma delimiter. Given the nature of this particular network, I have to jump through a couple of hoops to get things done, and there's a good chance I'm missing something very obvious.
In a nutshell, I kick off my script from a client machine that uses PLINK to run a remote psql command on another box over an SSH connection. That psql command is hitting a Postgres server on a third machine (I can't connect directly from client to DB, hence the extra step in between).
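For reference, the one-shot version of the job looks roughly like this (the key file and host names are placeholders, and the exact quoting may need adjusting for the shell the job runs from):

# plink runs psql on server1, which connects to the Postgres server on server2
plink -batch -i mykey.ppk user@server1 "psql -U user -h server2 database -c \"\copy (select ...) to '~/myfile.csv' with CSV header;\""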
If I manually SSH from client to server 1, connect to the Postgres box, and use \copy... with CSV header, the file that's created is perfect, and any fields that contain a comma are automatically surrounded by double quotes.
However, if I try to issue that \copy (or COPY) command as a single command, the output doesn't contain those double quotes, so commas inside a field get interpreted as delimiters later on.
In other words, the file written by the interactive \copy has the necessary double quotes, but the one produced by this single command doesn't:
psql -Uuser -h server2 database -c "\copy (select ...) to '~/myfile.csv' with CSV header;"
You need the FORCE_QUOTE option, which forces quoting of all non-NULL values in the listed columns (or in every column when given *). Here is how to do it:
psql -U user -h server2 database -c "\copy (select ...) to '~/myfile.csv' WITH (FORMAT CSV, HEADER TRUE, FORCE_QUOTE *);"
See the COPY command documentation for the details of FORCE_QUOTE.
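Pushed through the PLINK hop, the whole thing would look something along these lines (host names, key file, and quoting are illustrative and may need adjusting for the client shell):

# psql runs on server1, so the client-side \copy writes ~/myfile.csv on server1
plink -batch -i mykey.ppk user@server1 "psql -U user -h server2 database -c \"\copy (select ...) to '~/myfile.csv' WITH (FORMAT CSV, HEADER TRUE, FORCE_QUOTE *);\""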