My goal is to efficiently import large amounts of data into a Postgres database. In principle, the raw data can be compressed by a factor of ~20 (e.g. with gzip).
The COPY statement seems to be the best option for a bulk import.
Apart from sslcompression (which compresses the stream at the TLS layer), is there a way to compress the actual data (content) transferred between client and server, or is that even built in by default?
Many thanks.
(It should not matter, but I am using Go.)
The COPY ... TO|FROM PROGRAM form lets you use gzip as the compression/decompression program, provided it is installed somewhere accessible to the postgres server process: https://www.postgresql.org/docs/current/sql-copy.html#id-1.9.3.55.10.
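A minimal sketch of the server-side variant from Go (any driver works; database/sql with lib/pq shown here). The table name "items", the server-local file path, and the connection string are placeholders for illustration. Note that COPY ... FROM PROGRAM runs the command as the postgres server process, so it requires superuser rights or membership in pg_execute_server_program, and the compressed file must already be on the server's filesystem.

```go
package main

import (
	"database/sql"
	"log"

	_ "github.com/lib/pq"
)

func main() {
	db, err := sql.Open("postgres", "postgres://user:pass@localhost:5432/mydb?sslmode=disable")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	// gzip -dc decompresses the server-local file and pipes the
	// plain-text rows into COPY on the server side; nothing compressed
	// (or uncompressed) ever crosses the network here.
	_, err = db.Exec(`COPY items FROM PROGRAM 'gzip -dc /var/lib/postgresql/import/items.csv.gz' WITH (FORMAT csv)`)
	if err != nil {
		log.Fatal(err)
	}
}
```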
You could also use COPY ... FROM STDIN (or COPY ... TO STDOUT) and do the decompression (or compression) client-side, as in the sketch below.
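A minimal sketch of that client-side approach in Go, assuming the jackc/pgx driver: the data stays gzip-compressed on the client's disk, is decompressed on the fly, and is streamed to the server over the COPY protocol. The table name "items", the file name "items.csv.gz", and the connection string are placeholders. Be aware that the bytes on the wire are the decompressed rows, so this keeps the data compressed at rest but does not reduce network traffic.

```go
package main

import (
	"compress/gzip"
	"context"
	"log"
	"os"

	"github.com/jackc/pgx/v5"
)

func main() {
	ctx := context.Background()

	conn, err := pgx.Connect(ctx, "postgres://user:pass@localhost:5432/mydb")
	if err != nil {
		log.Fatal(err)
	}
	defer conn.Close(ctx)

	// Open the compressed dump and wrap it in a streaming gzip reader.
	f, err := os.Open("items.csv.gz")
	if err != nil {
		log.Fatal(err)
	}
	defer f.Close()

	gz, err := gzip.NewReader(f)
	if err != nil {
		log.Fatal(err)
	}
	defer gz.Close()

	// PgConn().CopyFrom streams the decompressed bytes to the server
	// via COPY ... FROM STDIN.
	tag, err := conn.PgConn().CopyFrom(ctx, gz, "COPY items FROM STDIN WITH (FORMAT csv)")
	if err != nil {
		log.Fatal(err)
	}
	log.Printf("copied %d rows", tag.RowsAffected())
}
```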