I am not able to see the logs attached to tasks from the Airflow UI.
The log-related settings in my airflow.cfg file are:
remote_base_log_folder =
base_log_folder = /home/my_projects/ksaprice_project/airflow/logs
worker_log_server_port = 8793
child_process_log_directory = /home/my_projects/ksaprice_project/airflow/logs/scheduler
Although I am setting remote_base_log_folder, it is trying to fetch the log from http://:8793/log/tutorial/print_date/2017-08-02T00:00:00. I don't understand this behavior. According to these settings, the workers should store the logs at /home/my_projects/ksaprice_project/airflow/logs, and the logs should be fetched from that same local location rather than from a remote one.
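For context, that URL is built from the hostname recorded for the task instance plus worker_log_server_port, so an empty hostname produces exactly the http://:8793/... address above. A rough illustration of the string construction (illustrative only, not Airflow's actual code):

# Illustrative only: shows why an empty recorded hostname yields the broken fetch URL.
hostname = ""  # hostname recorded for the task instance
worker_log_server_port = 8793
log_relative_path = "tutorial/print_date/2017-08-02T00:00:00"

url = "http://{}:{}/log/{}".format(hostname, worker_log_server_port, log_relative_path)
print(url)  # http://:8793/log/tutorial/print_date/2017-08-02T00:00:00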
Update: task_instance table content (screenshot not included).
If you run Airflow locally, logging information is accessible in the following locations:
Scheduler: logs are printed to the console and written to $AIRFLOW_HOME/logs/scheduler.
Webserver and Triggerer: logs are printed to the console.
Task: logs can be viewed in the Airflow UI or under $AIRFLOW_HOME/logs/.
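For example, the log for the print_date task of the tutorial DAG referenced above would sit on the worker's local disk under the base log folder. A rough sketch of how to check for it, based on the /log/<dag_id>/<task_id>/<execution_date> path in the fetch URL (the exact layout can differ between Airflow versions):

from pathlib import Path

# Values taken from the question above; adjust for your environment.
base_log_folder = Path("/home/my_projects/ksaprice_project/airflow/logs")
dag_id = "tutorial"
task_id = "print_date"
execution_date = "2017-08-02T00:00:00"

# Mirrors the /log/<dag_id>/<task_id>/<execution_date> structure of the fetch URL.
log_path = base_log_folder / dag_id / task_id / execution_date
print(log_path, "exists:", log_path.exists())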
Many common logging libraries, such as log4j, offer log rotation strategies to clear out older logs; Airflow, however, does not provide anything comparable out of the box.
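If old logs pile up, you can prune them yourself. A minimal sketch, assuming the default $AIRFLOW_HOME/logs layout and an arbitrary 30-day retention window (this is not an Airflow setting):

import os
import time
from pathlib import Path

RETENTION_DAYS = 30  # arbitrary retention window, not an Airflow setting
log_root = Path(os.environ.get("AIRFLOW_HOME", os.path.expanduser("~/airflow"))) / "logs"
cutoff = time.time() - RETENTION_DAYS * 24 * 60 * 60

for log_file in log_root.rglob("*"):
    # Remove log files whose last modification is older than the cutoff.
    if log_file.is_file() and log_file.stat().st_mtime < cutoff:
        log_file.unlink()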
I also faced the same problem. Setting the variables below in airflow.cfg worked for me. Use the machine's FQDN for {hostname} instead of localhost.
endpoint_url = http://{hostname}:8080
base_url = http://{hostname}:8080
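If you are unsure of the machine's FQDN, you can check it with Python's standard library (an illustrative one-liner, unrelated to Airflow itself):

import socket

# Prints the fully qualified domain name to substitute for {hostname} above.
print(socket.getfqdn())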
Best of luck!