
Ignore duplicates when importing from CSV

I'm using a PostgreSQL database, and after creating my table I have to populate it from a CSV file. However, the CSV file is corrupted: it contains rows that violate the primary key constraint, so the database throws an error and I'm unable to populate the table. Any ideas how to tell the database to ignore the duplicates when importing from CSV? Writing a script to remove them from the CSV file is not acceptable. Any workarounds are welcome too. Thank you! : )

Asked Oct 28 '25 by Anton Belev

1 Answer

In PostgreSQL, duplicate rows are rejected if they violate a unique constraint. I think your best option is to import the CSV file into a temp table that has no constraints, delete the duplicate rows from it, and finally insert from this temp table into your final table.
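A minimal sketch of that approach, assuming a hypothetical target table `items(id int primary key, name text)` and a CSV file at `/tmp/items.csv` (adjust names, path, and CSV options to your schema):

```sql
-- Staging table with the same columns as the target, but no constraints.
CREATE TEMP TABLE items_staging (LIKE items INCLUDING DEFAULTS);

-- Load the raw CSV; \copy is the psql client-side variant of COPY.
\copy items_staging FROM '/tmp/items.csv' WITH (FORMAT csv, HEADER true)

-- Keep one row per primary key value and insert into the final table.
INSERT INTO items
SELECT DISTINCT ON (id) *
FROM items_staging
ORDER BY id;
```

On PostgreSQL 9.5 and later you can skip the explicit de-duplication step and let the insert ignore conflicting rows instead, e.g. `INSERT INTO items SELECT * FROM items_staging ON CONFLICT (id) DO NOTHING;`.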

Answered Oct 30 '25 by Houari