Currently, I am using PostgreSQL 11.2 and I have a ~4 GB .csv file. First, I create a temporary table so that I can then select the needed columns into my current table:
create temporary table t (identification varchar, a1 text, a2 text, a3 text, a4 text, a5 text, a6 text, a7 text, a8 text);

copy t
from 'C:\PostgreSqlData\mydata.csv'
delimiter ',' csv;
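The follow-up step of filling my current table from the temporary one would be along these lines (mytable and the column list stand in for my actual table and the columns I need):

insert into mytable (identification, a1, a2)
select identification, a1, a2
from t;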
If I load only a smaller portion of the data (~10 MB), it works without any errors. But when I try to import the whole file, it fails with:
could not stat file "mydata.csv": unknown error
pgAdmin is a graphical user interface (GUI) for PostgreSQL that can also import CSV files directly into existing tables through its import/export dialog.
PostgreSQL does not impose a limit on the total size of a database; databases of several terabytes are reported to exist, so a 4 GB CSV file is well within its capacity.
The go-to solution for bulk loading data into PostgreSQL is the native COPY command.
First, check that psql is installed (psql --version should print the client version).
Then open your terminal and run the COPY through psql:
psql -c "COPY tablename FROM 'C:\PostgreSqlData\mydata.csv' delimiter ',' csv;"