I'm building an app running on Node.js using PostgreSQL, with Sequelize as the ORM. To avoid installing a real Postgres daemon and Node.js on my own machine, I'm using containers with docker-compose.
When I run docker-compose up, it starts the Postgres database ("database system is ready to accept connections") and the Node.js server, but the server can't connect to the database:
Error: connect ECONNREFUSED 127.0.0.1:5432
If I run the server without containers (with Node.js and Postgres installed directly on my machine), it works. But I want it to work correctly with containers, and I don't understand what I'm doing wrong.
Here is the docker-compose.yml file:
web:
  image: node
  command: npm start
  ports:
    - "8000:4242"
  links:
    - db
  working_dir: /src
  environment:
    SEQ_DB: mydatabase
    SEQ_USER: username
    SEQ_PW: pgpassword
    PORT: 4242
    DATABASE_URL: postgres://username:pgpassword@127.0.0.1:5432/mydatabase
  volumes:
    - ./:/src
db:
  image: postgres
  ports:
    - "5432:5432"
  environment:
    POSTGRES_USER: username
    POSTGRES_PASSWORD: pgpassword
Could someone help me please?
(someone who likes docker :) )
Your DATABASE_URL refers to 127.0.0.1, which is the loopback adapter. This means "connect to myself". When running both applications directly on the same host (without Docker), they are both addressable on that same adapter (also known as localhost).
When running both applications in containers, they are no longer both on localhost. Instead, you need to point the web container at the db container's IP address on the docker0 adapter, which docker-compose sets for you.
Change 127.0.0.1 to the container name (e.g. db).
Example, change:

DATABASE_URL: postgres://username:pgpassword@127.0.0.1:5432/mydatabase

to

DATABASE_URL: postgres://username:pgpassword@db:5432/mydatabase
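As a quick sanity check, you can parse both connection strings with Node's built-in WHATWG URL class to confirm that only the host component changes (a sketch; the credentials are the placeholder values from the compose file above):

```javascript
// Parse the broken and the fixed connection strings; only the host differs.
const before = new URL('postgres://username:pgpassword@127.0.0.1:5432/mydatabase');
const after = new URL('postgres://username:pgpassword@db:5432/mydatabase');

console.log(before.hostname); // '127.0.0.1' — the web container's own loopback
console.log(after.hostname);  // 'db' — the linked db container
console.log(after.username, after.port, after.pathname); // user, port and database are unchanged
```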
This works thanks to Docker links: the web container has a file (/etc/hosts) with a db entry pointing to the IP that the db container is on. This is the first place a system (in this case, the container) will look when trying to resolve hostnames.
For further readers: if you're using Docker Desktop for Mac, use host.docker.internal instead of localhost or 127.0.0.1, as suggested in the docs. I came across the same "connection refused" problem: the backend api-service couldn't connect to postgres using localhost/127.0.0.1. Below are my docker-compose.yml and environment variables as a reference:
version: "2"
services:
  api:
    container_name: "be"
    image: <image_name>:latest
    ports:
      - "8000:8000"
    environment:
      DB_HOST: host.docker.internal
      DB_USER: <your_user>
      DB_PASS: <your_pass>
    networks:
      - mynw
  db:
    container_name: "psql"
    image: postgres
    ports:
      - "5432:5432"
    environment:
      POSTGRES_DB: <your_postgres_db_name>
      POSTGRES_USER: <your_postgres_user>
      POSTGRES_PASS: <your_postgres_pass>
    volumes:
      - ~/dbdata:/var/lib/postgresql/data
    networks:
      - mynw