I recently upgraded from Airflow 1.9 to 1.10 and performed the following commands:
Jobs seem to be running fine, but when I click into a DAG task node, no logs appear.
I opened my network tab; a request to the following URL returns this JSON:
$AIRFLOW_URL/airflow/get_logs_with_metadata?dag_id=xxxx&task_id=xxxxx&execution_date=2018-09-09T23%3A03%3A10.585986%2B00%3A00&try_number=1&metadata=null
{"error":true,"message":["Task log handler file.task does not support read logs.\n'NoneType' object has no attribute 'read'\n"],"metadata":{"end_of_log":true}}
Additionally, there is a 404 on a request for js/form-1.0.0.js. Any advice on extra steps to get logs working again?
I can confirm that logs are showing up in the logs directory for tasks on the Airflow server.
Using https://github.com/apache/incubator-airflow/blob/master/airflow/config_templates/default_airflow.cfg as a reference:
I previously had
task_log_reader = file.task
and changed it to:
task_log_reader = task
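To illustrate why the old value breaks (my understanding of the 1.10 behavior, sketched with hypothetical wiring, not Airflow's actual internals): the webserver looks up a log handler on the airflow.task logger by the name stored in task_log_reader. In 1.10 the default handler is named task, so file.task matches nothing, the lookup returns None, and the subsequent read() call produces the "'NoneType' object has no attribute 'read'" error from the JSON above.

```python
import logging

# Hypothetical reconstruction of the handler lookup; names mirror the
# question's config values, not Airflow's real module layout.
logger = logging.getLogger("airflow.task")
handler = logging.StreamHandler()
handler.name = "task"  # the 1.10 default handler name is "task"
logger.addHandler(handler)

def find_log_handler(reader_name):
    """Return the handler whose name matches task_log_reader, else None."""
    return next((h for h in logger.handlers if h.name == reader_name), None)

print(find_log_handler("file.task"))       # None -> AttributeError on .read later
print(find_log_handler("task").name)       # task -> logs render in the UI
```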
As well I added:
log_filename_template = {{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log
log_processor_filename_template = {{ filename }}.log
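To sanity-check what path log_filename_template produces, you can render it by hand with Jinja (which is what Airflow uses for this template). The ti object and timestamp below are stand-in values, not output from a real task instance:

```python
from types import SimpleNamespace

from jinja2 import Template

# Stand-in task instance with the attributes the template references.
ti = SimpleNamespace(dag_id="example_dag", task_id="example_task")

# Same template string as in airflow.cfg above.
tmpl = Template("{{ ti.dag_id }}/{{ ti.task_id }}/{{ ts }}/{{ try_number }}.log")
path = tmpl.render(ti=ti, ts="2018-09-09T23:03:10+00:00", try_number=1)
print(path)  # example_dag/example_task/2018-09-09T23:03:10+00:00/1.log
```

The rendered path should match the directory layout you see under your logs folder; if the template has stray braces (as mine did after the upgrade), the rendered path will contain literal }} fragments and the handler cannot find the files.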