I'm running a Ruby on Rails application in a Docker container. I want to create and then restore a database dump in the Postgres container, but I'm running into problems.
Below is what I've done so far:
1) Added a bash script in the /docker-entrypoint-initdb.d folder. The script just creates the database:
psql -U docker -d postgres -c 'create database dbname;'
RESULT: The database is created, but the Rails server exits with code 0. Error: web_1 exited with code 0
2) Added a script to be executed before docker-compose up:
# Run docker db container
echo "Running db container"
docker-compose run -d db
# Sleep for 10 sec so that the container has time to start
echo "Sleep for 10 sec"
sleep 10
echo 'Copying db_dump.gz to db container'
docker cp db_dump/db_dump.gz $(docker-compose ps -q db):/
# Create database `dbname`
echo 'Creating database `dbname`'
docker exec -i $(docker-compose ps -q db) psql -U docker -d postgres -c 'create database dbname;'
echo 'Importing database `dbname`'
docker exec -i $(docker-compose ps -q db) bash -c "gunzip -c /db_dump.gz | psql -U postgres dbname"
RESULT: The database is created and the data restored, but another db container ends up running alongside the web application server when I run docker-compose up.
docker-compose.yml:
version: '2'
services:
  db:
    image: postgres
    environment:
      - POSTGRES_PASSWORD=docker
      - POSTGRES_USER=docker
  web:
    build: .
    command: bundle exec rails s -p 3000 -b '0.0.0.0' -d
    image: uname/application
    links:
      - db
    ports:
      - "3000:3000"
    depends_on:
      - db
    tty: true
Can someone please help me create and import the database?
EDIT:
I've tried one more approach: adding a POSTGRES_DB=db_name environment variable in the docker-compose.yml file so that the database is created automatically, planning to import the dump after the application is running (docker-compose up). But I'm getting an error: web_1 exited with code 0.
I'm confused about why I'm getting this error (in the first and third approaches); something seems to be messed up in the docker-compose file.
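For reference, the db service for this third approach looks roughly like this:
db:
  image: postgres
  environment:
    - POSTGRES_PASSWORD=docker
    - POSTGRES_USER=docker
    - POSTGRES_DB=db_name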
You'll need to mount the dump into the container so you can access it. Something like this in docker-compose.yml:
db:
  volumes:
    - './db_dump:/db_dump'
Make a local directory named db_dump and place your db_dump.gz file there.
Use POSTGRES_DB in the environment (as you mentioned in your question) to automatically create the database. Start db by itself, without the rails server.
docker-compose up -d db
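For reference, the db service with the volume and POSTGRES_DB together might look roughly like this (dbname being the database name used in the question):
db:
  image: postgres
  environment:
    - POSTGRES_PASSWORD=docker
    - POSTGRES_USER=docker
    - POSTGRES_DB=dbname
  volumes:
    - './db_dump:/db_dump'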
Wait a few seconds for the database to be available. Then, import your data.
docker-compose exec db gunzip /db_dump/db_dump.gz
docker-compose exec db psql -U postgres -d dbname -f /db_dump/db_dump
docker-compose exec db rm -f /db_dump/db_dump
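(Note that gunzip strips the .gz suffix, which is why the last two commands reference /db_dump/db_dump; since the file lives in the bind mount, it also gets decompressed on the host.) If you'd rather leave the mounted file untouched, you could stream it instead, along the lines of the command already used in the question:
docker-compose exec db bash -c 'gunzip -c /db_dump/db_dump.gz | psql -U postgres -d dbname'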
You can also just make a script to do this import, stick that in your image, and then use a single docker-compose command to call that. Or you can have your entrypoint script check whether a dump file is present, and if so, unzip it and import it... whatever you need to do.
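As a rough sketch of such a script (the file name import_dump.sh, its location, and the dump path are assumptions for illustration, not anything from the question):
#!/bin/bash
# import_dump.sh -- import the dump into dbname if one is present, otherwise do nothing.
set -e

DUMP=/db_dump/db_dump.gz   # assumed path, matching the volume mounted above

if [ -f "$DUMP" ]; then
  echo "Importing $DUMP into dbname"
  gunzip -c "$DUMP" | psql -U postgres -d dbname
else
  echo "No dump found at $DUMP, skipping import"
fi
Either way, once the data is in, bring up the rails server: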
docker-compose up -d web
If you are doing this by hand for prep of a new setup, then you're done. If you need to automate this into a toolchain, you can do some of this stuff in a script. Just start the containers separately, doing the db import in between, and use sleep to cover any startup delays.
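For instance, a rough wrapper script (the 10-second sleep is a guess; adjust it, or poll with pg_isready, for your setup):
#!/bin/bash
set -e

# Start only the database service
docker-compose up -d db

# Give postgres time to initialize before importing
sleep 10

# -T skips the pseudo-TTY that exec allocates by default, which breaks in scripts
docker-compose exec -T db bash -c 'gunzip -c /db_dump/db_dump.gz | psql -U postgres -d dbname'

# Now start the rails server
docker-compose up -d web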
web_1 exited with code 0
Have you tried checking the logs of the web_1 container? Run docker-compose logs web
I strongly recommend that you don't initialize your db container manually; let it happen automatically as part of container startup.
If you look at the entrypoint of the postgres image, you'll see that anything placed in the /docker-entrypoint-initdb.d/ directory of the container is executed automatically on first startup, so docker-compose.yml could be:
db:
  volumes:
    - './initdb.d:/docker-entrypoint-initdb.d'
And put your db_dump.gz into ./initdb.d on your local machine. Note that the entrypoint only runs files ending in .sh, .sql, or .sql.gz (in reasonably recent images), so you will likely need to rename the dump to something like db_dump.sql.gz.
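A rough sketch of the host-side prep, assuming the dump currently sits in ./db_dump/ as in the question:
mkdir -p initdb.d
# rename to .sql.gz so the postgres entrypoint will gunzip it and feed it to psql
cp db_dump/db_dump.gz initdb.d/db_dump.sql.gz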