How can I add my own log messages to the logs that Apache Airflow generates automatically? Any print statements won't get logged there, so how can I add my own logs so that they show up in the UI as well?
Users can specify the directory in which log files are placed via the base_log_folder setting in airflow.cfg. By default, logs are placed under the AIRFLOW_HOME directory.
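As a rough sketch of what that looks like in airflow.cfg: in Airflow 2.x the option lives under the [logging] section (older 1.x releases kept it under [core]), and the path below is only an example you would adapt to your setup.

    [logging]
    # Example path; adjust to your environment (default is {AIRFLOW_HOME}/logs)
    base_log_folder = /home/airflow/airflow/logs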
If you run Airflow on Google Cloud Composer, you can also view the logs in the Airflow web interface. Streaming logs are a superset of the logs in Airflow; to access them, go to the Logs tab of the Environment details page in the Google Cloud console, use Cloud Logging, or use Cloud Monitoring. Logging and Monitoring quotas apply.
The service logs are available at /media/ephemeral0/logs/airflow inside the cluster node. Since Airflow runs on a single-node machine, the logs are accessible on that same node. These logs are helpful for troubleshooting cluster bring-up and scheduling issues.
To enable a custom logging configuration, create a configuration file at ~/airflow/config/log_config.py in which you specify your modifications to DEFAULT_LOGGING_CONFIG. You may need to do this if, for example, you want to add a custom handler.
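A minimal sketch of such a file, assuming a stock Airflow install: it copies DEFAULT_LOGGING_CONFIG and attaches an extra handler to the task logger. The handler name, the log path, and the logging_config_class value shown in the trailing comment are assumptions you would adapt to your environment.

    # ~/airflow/config/log_config.py  (make sure this directory is on PYTHONPATH)
    from copy import deepcopy

    from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

    # Start from Airflow's default logging configuration and extend it.
    LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)

    # Example: add an extra file handler (name and path are illustrative).
    LOGGING_CONFIG["handlers"]["my_extra_file"] = {
        "class": "logging.FileHandler",
        "formatter": "airflow",
        "filename": "/tmp/airflow_extra.log",
    }
    LOGGING_CONFIG["loggers"]["airflow.task"]["handlers"].append("my_extra_file")

    # Then point Airflow at this dict in airflow.cfg, e.g.:
    #   [logging]
    #   logging_config_class = log_config.LOGGING_CONFIG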
I think you can work around this by using the standard logging module and leaving the configuration to Airflow.
Something like:
    import ...

    dag = ...

    def print_params_fn(**kwargs):
        import logging
        # Anything sent through the logging module ends up in the task log,
        # which is what the Airflow UI displays.
        logging.info(kwargs)
        return None

    print_params = PythonOperator(
        task_id="print_params",
        python_callable=print_params_fn,
        provide_context=True,
        dag=dag,
    )
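As a side note beyond the original answer: if you are writing your own operator or hook, Airflow's BaseOperator already mixes in a logger, so something like the sketch below should also surface in the task log. The operator name is made up for illustration.

    from airflow.models import BaseOperator

    class MyCustomOperator(BaseOperator):  # hypothetical operator for illustration
        def execute(self, context):
            # self.log comes from Airflow's LoggingMixin and writes to the task log
            self.log.info("Running MyCustomOperator with context keys: %s", list(context))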