I'm trying to import a CSV file into a PostgreSQL database. After reading about the difference between COPY and \copy, I get this error when executing my script.
Here's my code snippet:
try:
    csv_data = os.path.realpath('test.csv')
    con = psycopg2.connect(database='db01', user='postgres')
    cur = con.cursor()
    cur.execute("\copy stamm_data from '%s' DELIMITER ';' csv header" % csv_data)
    con.commit()
And here's the error:
Error: syntax error at or near "\"
LINE 1: \copy stamm_data from '/home/jw/dev/test.csv' delimiter ';' ...
^
When using COPY instead, I get:
Error: could not open file "/home/jw/dev/test.csv" for reading: Permission denied
This is despite the Postgres user 'postgres' being a superuser, and the Ubuntu user running the script having read permission on the test.csv file.
Any ideas?
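For what it's worth, both errors have the same root cause: \copy is a psql meta-command, so it never reaches the server and psycopg2 can't execute it, while a plain COPY ... FROM '<file>' is read by the server process (running as the postgres OS user), which explains the permission error even though your shell user can read the file. One way around both is to stream the file from the client with psycopg2's copy_expert() and COPY ... FROM STDIN. A minimal sketch, assuming the same table, connection parameters, and CSV layout as above:

```python
COPY_SQL = "COPY stamm_data FROM STDIN WITH (FORMAT csv, DELIMITER ';', HEADER true)"

def load_csv(con, path):
    """Stream a local CSV into stamm_data via the client-side COPY protocol.

    With COPY ... FROM STDIN the *client* reads the file, so the server
    process never needs filesystem access to it.
    """
    with con.cursor() as cur, open(path) as f:
        cur.copy_expert(COPY_SQL, f)
    con.commit()

if __name__ == '__main__':
    import psycopg2  # imported here so the sketch can be read without a live DB
    con = psycopg2.connect(database='db01', user='postgres')
    load_csv(con, 'test.csv')
    con.close()
```

Because HEADER is handled by COPY itself here, you also don't need to strip the header row from the file first.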
Okay, here we go. The solution with copy_from works fine - I removed the header from the CSV file and imported that file with copy_from.
But now I'm running into the following error:
Error: null value in column "id" violates not-null constraint
DETAIL: Failing row contains (null, foo01, ACE001, 3).
My table has the following columns:
ID, hotelcode, hotelname, stars
ID is my PK, so I can't remove the NOT NULL constraint. How do I import the IDs for each line? Or how do I tell Postgres to fill the ID column with DEFAULT, so that the database generates the IDs by itself?
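Sticking with copy_from, one way is to pass its columns argument and omit id, so Postgres falls back to the column's DEFAULT for every row. A sketch under the assumption that id is a serial (or identity) column and the table is laid out as above:

```python
# Omit 'id' from the column list -> Postgres fills it from the column DEFAULT
# (this assumes 'id' is a serial/identity column, i.e. it *has* a default).
COLUMNS = ('hotelcode', 'hotelname', 'stars')

def load_csv(con, path):
    """Import a header-less CSV into stamm_data, letting Postgres assign IDs."""
    with con.cursor() as cur, open(path) as f:
        cur.copy_from(f, 'stamm_data', sep=';', columns=COLUMNS)
    con.commit()
```

If id is a plain integer column with no default, this still fails with the same NOT NULL error; in that case the column needs a sequence-backed default first.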