I want to copy a CSV file to a Postgres table. There are about 100 columns in this table, so I do not want to rewrite them if I don't have to.
I am using the command \copy table from 'table.csv' delimiter ',' csv; but without the table already created I get ERROR: relation "table" does not exist. If I create a blank table first I get no error, but nothing happens. I tried the command two or three times with no output or messages, yet when I checked through PGAdmin the table had not been updated.
Is there a way to import a table with headers included like I am trying to do?
The header of a CSV file is its first row, which lists the name of each column. COPY does not create table columns from it; the HEADER option just tells Postgres to skip that row instead of loading it as data, so the target table still has to exist first.
This worked for me; the first row of the file had the column names in it:
COPY wheat FROM 'wheat_crop_data.csv' DELIMITER ';' CSV HEADER
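If you want to drive the same load from Python instead of psql, here is a minimal sketch using psycopg2's copy_expert. The connection string and file path are assumptions, and the wheat table must already exist with columns matching the file:

import psycopg2

# hypothetical credentials; adjust for your server
conn = psycopg2.connect("dbname=db_name user=user password=pass host=localhost")
with conn, conn.cursor() as cur, open('wheat_crop_data.csv') as f:
    # COPY ... FROM STDIN reads the file on the client side, like \copy,
    # so the Postgres server does not need access to the file itself.
    cur.copy_expert("COPY wheat FROM STDIN WITH (FORMAT csv, DELIMITER ';', HEADER)", f)
conn.close()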
With the Python library pandas, you can easily create column names and infer data types from a CSV file.
from sqlalchemy import create_engine
import pandas as pd

engine = create_engine('postgresql://user:pass@localhost/db_name')
df = pd.read_csv('/path/to/csv_file')  # column names and dtypes are inferred from the file
df.to_sql('pandas_db', engine)         # creates the table pandas_db if it does not already exist
The if_exists parameter can be set to replace or append to an existing table, e.g. df.to_sql('pandas_db', engine, if_exists='replace'). This works for other input file types as well; see the pandas documentation for read_csv and to_sql.
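As a hedged variation on the same call (reusing the df and engine from above), you can append to an existing table instead of replacing it, and keep pandas from writing its row index as an extra column:

# index=False stops pandas from writing the DataFrame index as a column;
# chunksize is optional and writes the rows in batches for large files.
df.to_sql('pandas_db', engine, if_exists='append', index=False, chunksize=1000)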