I am running Airflow (1.10.1) inside a VM on GCP via Docker. I have already changed the local time of the VM and set the default time zone in my config (airflow.cfg) to my country's zone (America/Sao_Paulo), but the home screen still shows UTC and, consequently, processing is done in UTC too. Is there anything else I can do?
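For reference, this is the setting I changed (in Airflow 1.10 the key is default_timezone under the [core] section):

[core]
# Time zone applied to naive datetimes; Airflow still stores everything in UTC
default_timezone = America/Sao_Paulo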
Complementing the answer above, I was able to make execution follow my time zone inside the DAG with the code below:
from datetime import datetime, timedelta

import pendulum
from airflow import DAG

local_tz = pendulum.timezone('America/Sao_Paulo')

default_args = {
    'owner': 'airflow',
    # A timezone-aware start_date makes the schedule follow local time
    'start_date': datetime(year=2019, month=7, day=26, tzinfo=local_tz),
    'depends_on_past': False,
    'email': ['[email protected]'],
    'email_on_failure': False,
    'email_on_retry': False,
    # If a task fails, retry it once after waiting at least 5 minutes
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    'on_failure_callback': slack_msg,  # slack_msg: alert callback defined elsewhere
}

dag = DAG(
    dag_id=nm_dag,  # nm_dag: DAG name defined elsewhere
    default_args=default_args,
    schedule_interval='40 11 * * *',  # 11:40 America/Sao_Paulo time
    dagrun_timeout=timedelta(minutes=60),
)
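One detail worth noting: with a timezone-aware start_date, Airflow evaluates a cron schedule_interval in that time zone, so the '40 11 * * *' above triggers at 11:40 São Paulo time, always at the same local wall-clock time regardless of daylight saving shifts. The web UI, however, will still display run dates in UTC.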
From the Airflow documentation on time zones:
Support for time zones is enabled by default. Airflow stores datetime information in UTC internally and in the database. It allows you to run your DAGs with time zone dependent schedules. At the moment Airflow does not convert them to the end user’s time zone in the user interface. There it will always be displayed in UTC. Also templates used in Operators are not converted.
Time zone information is exposed and it is up to the writer of DAG to process it accordingly.
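In practice that means any conversion to local time has to happen inside your own task code. Here is a minimal sketch, assuming Airflow 1.10's PythonOperator (the task name and callable below are illustrative, not from the original post):

import pendulum
from airflow.operators.python_operator import PythonOperator

local_tz = pendulum.timezone('America/Sao_Paulo')

def show_local_time(**context):
    # execution_date is timezone-aware but expressed in UTC;
    # convert it explicitly before using it in local-time logic
    local_dt = pendulum.instance(context['execution_date']).in_timezone(local_tz)
    print('Local execution date:', local_dt)

show_local = PythonOperator(
    task_id='show_local_time',
    python_callable=show_local_time,
    provide_context=True,  # needed in Airflow 1.10 to receive the context
    dag=dag,
)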