Airflow kills my tasks after 1 minute

I have a very simple DAG with two tasks, like the following:

import datetime as dt

from airflow import DAG
from airflow.operators.python_operator import PythonOperator

default_args = {
    'owner': 'me',
    'start_date': dt.datetime.today(),
    'retries': 0,
    'retry_delay': dt.timedelta(minutes=1)
}

dag = DAG(
    'test DAG',
    default_args=default_args,
    schedule_interval=None
)

t0 = PythonOperator(
    task_id="task 1",
    python_callable=run_task_1,
    op_args=[arg_1, args_2, args_3],
    dag=dag,
    execution_timeout=dt.timedelta(minutes=60)
)

t1 = PythonOperator(
    task_id="task 2",
    python_callable=run_task_2,
    dag=dag,
    execution_timeout=dt.timedelta(minutes=60)
)

t1.set_upstream(t0)

However, when I run it, I see the following in the logs:

[2017-10-17 16:18:35,519] {jobs.py:2083} INFO - Task exited with return code -9

There are no other useful error logs. Has anyone seen this before? Did I define my DAG incorrectly? Any help is appreciated!

asked Oct 17 '17 by kassnl

1 Answer

Return code -9 means the task process was killed with SIGKILL (signal 9). This typically happens when the worker or its container doesn't have enough memory for the task, and the operating system's OOM killer terminates the process. See https://www.astronomer.io/guides/dag-best-practices/
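One way to confirm this is to log the task's peak memory usage from inside the callable before it finishes. Below is a minimal sketch using Python's standard resource module; the body of run_task_1 is a hypothetical placeholder for your own logic, and on Linux ru_maxrss is reported in kilobytes.

import logging
import resource


def run_task_1(arg_1, arg_2, arg_3):
    # Your real task logic would go here; this is just a placeholder.
    result = [arg_1, arg_2, arg_3]

    # Log the peak resident set size of this process so the task log
    # shows how close the task gets to the worker's memory limit.
    # On Linux, ru_maxrss is reported in kilobytes.
    peak_kb = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    logging.info("Peak memory usage: %.1f MB", peak_kb / 1024.0)
    return result

If the logged peak is close to the memory available to the worker, either give the worker (or its container) more memory or rework the callable to process data in smaller chunks.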

answered Sep 28 '22 by Agastya