I am running Airflow in a Docker container. I have created separate containers for the Postgres server and the RabbitMQ server and connected them to the Airflow container using a Docker network, by following this nice article. Now my Airflow container is running and connected to the other containers over the Docker network; the process went smoothly so far. The problem is how to run `airflow webserver`, `airflow scheduler`, and `airflow worker` in the same container. After some research I found that it is recommended to run one service per container. In my `airflow.cfg` I have two relevant settings:
`broker_url = 'amqp://guest:guest@ksaprice_rabbitmq:8080//'` and `celery_result_backend = db+postgresql://developer:user889@ksaprice_postgres:5432/airflow`. These settings refer to the database and RabbitMQ, which are already running in different containers; they do not refer to an IP/URL for whatever runs Celery and the scheduler. I assume this is because Celery and the scheduler run on the Airflow server itself. My questions are:
How can I run the `airflow webserver`, `airflow scheduler`, and `airflow worker` commands in the same Airflow container? I am a newbie to Airflow and Docker.
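Since the hostnames in those settings are container names resolved over the Docker network, one quick sanity check is to parse the URLs and confirm the host and port you expect. A minimal sketch using only the Python standard library (the URL values are the ones from my `airflow.cfg` above):

```python
from urllib.parse import urlsplit

# The two settings from airflow.cfg, as configured above.
broker_url = "amqp://guest:guest@ksaprice_rabbitmq:8080//"
celery_result_backend = "db+postgresql://developer:user889@ksaprice_postgres:5432/airflow"

for name, url in [("broker_url", broker_url),
                  ("celery_result_backend", celery_result_backend)]:
    parts = urlsplit(url)
    # hostname/port are what Docker's embedded DNS must resolve:
    # the names of the containers on the shared network.
    print(f"{name}: scheme={parts.scheme} host={parts.hostname} port={parts.port}")
```

From inside the Airflow container, those hostnames should also answer to something like `ping ksaprice_rabbitmq`, which confirms the Docker network wiring independently of Airflow.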
After spending a lot of time, I found the following answers:
Run `docker exec -it airflow_container bash`; the CLI is now attached to `airflow_container`, so run `airflow worker` there. Repeat the same process for `airflow scheduler` and `airflow flower`. You will then have three CLIs running three services on the same `airflow_container` - this is the simplest way I found. You can also use `airflow webserver --hostname=some_host --port=some_port`
and `airflow flower --hostname=some_host --port=some_port` to run them on different servers. But `airflow worker` has no such options, so maybe there is some other way to run the worker on a different server.
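If you would rather not keep three interactive shells open, the same idea can be scripted: one entrypoint process that starts all three services as children and waits on them. A minimal sketch, assuming `airflow` is on the PATH inside the container (the service list is illustrative, not an official Airflow entrypoint):

```python
import subprocess
from typing import List

def launch_services(commands: List[List[str]]) -> List[int]:
    """Start every command as a child process, then wait for all of them.
    Returns the exit codes in the same order as the input commands."""
    procs = [subprocess.Popen(cmd) for cmd in commands]
    return [p.wait() for p in procs]

if __name__ == "__main__":
    # Assumption: `airflow` is installed and on PATH in this container.
    services = [
        ["airflow", "webserver"],
        ["airflow", "scheduler"],
        ["airflow", "worker"],
    ]
    launch_services(services)
```

Note that this still runs three services in one container; a process supervisor would additionally restart a service that dies, which this sketch does not do.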