I have the following docker-compose.yml file:
version: '3'
services:
  postgres:
    image: postgres
    container_name: postgres
    ports:
      - "5431:5432"
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=anime
    volumes:
      - ./init.sql:/docker-entrypoint-initdb.d/init.sql
This configuration starts a Postgres database. In the volumes section I mount init.sql, which should set up a table:
CREATE TABLE anime (
    anime_id INT PRIMARY KEY,
    title TEXT
);
Then, I would like to fill the Postgres database with data from the CSV file.
I tried to add another volume to docker-compose:
- ./preload.sql:/preload/preload.sql
with this script:
copy anime FROM 'docker/data/AnimeList.csv' DELIMITER ',' CSV HEADER;
The CSV file is located in the data folder, on the same level as docker-compose.yml.
But it is not working. The database is created correctly, but it doesn't have the table or the data. When I connect to the Docker container, run the psql command, and try to query the anime table, I get the following error:
Did not find any relation named "anime".
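Roughly, this is how I check from the host (the container name, user, and database are the ones from the compose file above):

# list tables in the anime database inside the running container
docker exec -it postgres psql -U postgres -d anime -c '\dt'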
My question is: how do I preload the Postgres container with the CSV data file in docker-compose?
I managed to make it work using a custom Dockerfile; here's my solution:
data/
    datasource.csv
db/
    scripts/
        1_init.sql
        2_copy.sql
    Dockerfile
docker-compose.yml
The CSV file is located in the data folder inside the project.
In the project folder there is the following docker-compose.yml file:
version: '3.3'
services:
  db:
    build: ./db
    container_name: postgres
    ports:
      - "5431:6666"
    environment:
      - POSTGRES_USER=postgres
      - POSTGRES_PASSWORD=postgres
      - POSTGRES_DB=db_name
    volumes:
      - ./data:/data
The Dockerfile contains:
FROM postgres:alpine
ADD scripts/1_init.sql /docker-entrypoint-initdb.d
ADD scripts/2_copy.sql /docker-entrypoint-initdb.d
RUN chmod a+r /docker-entrypoint-initdb.d/*
EXPOSE 6666
The body of 1_init.sql:
CREATE TABLE table_name
(
--statement body
);
And 2_copy.sql:
COPY table_name FROM '/data/datasource.csv' DELIMITER ',' CSV HEADER;
1_init.sql creates the DB table; it has to have the same column names as in the CSV file. 2_copy.sql is responsible for copying data from the CSV into Postgres.
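As a concrete example, for the anime CSV from the question (assuming its header matches the anime_id and title columns), the two scripts could look roughly like this:

-- 1_init.sql: column names and order must match the CSV header
CREATE TABLE anime
(
    anime_id INT PRIMARY KEY,
    title    TEXT
);

-- 2_copy.sql: load the mounted CSV; HEADER skips the first line
COPY anime FROM '/data/AnimeList.csv' DELIMITER ',' CSV HEADER;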
The Dockerfile uses the postgres image and copies all *.sql files to /docker-entrypoint-initdb.d/. All files in that directory are executed in alphanumerical order, which is why the *.sql file names start with digits. Finally, port 6666 is exposed.
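As a side note, if you would rather avoid a custom Dockerfile, the official postgres image should (as far as I know) also pick up scripts that are bind-mounted into /docker-entrypoint-initdb.d/ straight from docker-compose.yml, roughly like this sketch:

db:
  image: postgres:alpine
  volumes:
    - ./db/scripts:/docker-entrypoint-initdb.d
    - ./data:/data

Both variants rely on the same init mechanism, so the alphanumerical ordering of the scripts applies either way.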
docker-compose.yml builds the Dockerfile from the db folder and makes the database accessible through port 5431. Basic Postgres properties are used as environment variables. Finally, the data folder with the CSV file is mounted into the container.
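To rebuild and verify, something along these lines should work; note that the init scripts only run when Postgres initializes an empty data directory, so if the container was already created without them, recreate it first (e.g. docker-compose down and then up again):

# rebuild the image and start the container in the background
docker-compose up --build -d
# count the loaded rows (user, database, and table names taken from the files above)
docker exec -it postgres psql -U postgres -d db_name -c 'SELECT COUNT(*) FROM table_name;'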