DockerOperator has a parameter xcom_push which, when set, pushes the output of the Docker container to XCom:
t1 = DockerOperator(task_id='run-hello-world-container',
                    image='hello-world',
                    xcom_push=True, xcom_all=True,
                    dag=dag)
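For reference, a minimal self-contained version of that DAG might look like the sketch below. The dag_id, start_date, and schedule are placeholders I've assumed, and the DockerOperator import path varies across Airflow versions.

from datetime import datetime

from airflow import DAG
from airflow.operators.docker_operator import DockerOperator  # import path varies by Airflow version

# Placeholder DAG definition; dag_id, start_date and schedule_interval are assumptions.
dag = DAG(dag_id='docker_xcom_example',
          start_date=datetime(2019, 1, 1),
          schedule_interval=None)

t1 = DockerOperator(task_id='run-hello-world-container',
                    image='hello-world',
                    xcom_push=True,  # push the container's output to XCom
                    xcom_all=True,   # push all log lines, not just the last one
                    dag=dag)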
In the admin interface under XCom, I can see these values with the key return_value. But how can I access them in the DAG?
If I try:
t1_email_output = EmailOperator(task_id='t1_email_output',
                                to='[email protected]',
                                subject='Airflow sent you an email!',
                                html_content={{ ti.xcom_pull(task_ids='return_value') }},
                                dag=dag)
I get Broken DAG: [PATH] name 'ti' is not defined.
If I try:
t1_email_output = EmailOperator(task_id='t1_email_output',
                                to='[email protected]',
                                subject='Airflow sent you an email!',
                                html_content=t1.xcom_pull(task_ids='return_value'),
                                dag=dag)
I get Broken DAG: [PATH] xcom_pull() missing 1 required positional argument: 'context'.
You need to pass the task ID from which you are pulling the XCom, not the key name. In your example it would be
{{ ti.xcom_pull('run-hello-world-container') }}
Also, in the second snippet it should be ti instead of t1, and since ti only exists inside the Jinja context, the expression still has to live inside a quoted template string:
html_content="{{ ti.xcom_pull('run-hello-world-container') }}",
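Applied to the question's EmailOperator, that would look something like this (note that, as the asker discovers below, the template must be wrapped in quotes so it is an ordinary string at load time):

t1_email_output = EmailOperator(task_id='t1_email_output',
                                to='[email protected]',
                                subject='Airflow sent you an email!',
                                html_content="{{ ti.xcom_pull('run-hello-world-container') }}",
                                dag=dag)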
I found the problem: it turns out I was missing the quotes, and my parameter was also wrong:
t1_email_output = EmailOperator(task_id='t1_email_output',
                                to='[email protected]',
                                subject='Airflow sent you an email!',
                                html_content="{{ ti.xcom_pull(key='return_value') }}",
                                dag=dag)
This sends an email with the Docker container's output, as I expect.
I think what is happening is that the {{ }} syntax is processed as a Jinja template by Airflow when the DAG is run, but not when it is loaded. Without the quotes, Python tries to evaluate {{ ti.xcom_pull(...) }} as an ordinary expression while Airflow parses the DAG file, and this fails because ti is only defined at render time. With the quotes, the templated expression is just a string as far as the Python interpreter is concerned, so the DAG loads cleanly. When the EmailOperator is actually triggered during a DAG run, Airflow renders the template (html_content is one of the operator's templated fields) into the actual XCom value.
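One way to see the load-time/run-time distinction is to pull the same XCom imperatively in a PythonOperator, where ti arrives through the runtime context instead of a Jinja template. This is only a sketch, using the Airflow 1.x-style API that matches the question's xcom_push/xcom_all parameters; the task_id and callable name are my own.

from airflow.operators.python_operator import PythonOperator  # import path varies by Airflow version

def print_container_output(**context):
    # At run time the task instance is available in the context dict,
    # so no template rendering is involved here.
    output = context['ti'].xcom_pull(task_ids='run-hello-world-container')
    print(output)

t2 = PythonOperator(task_id='print_container_output',
                    python_callable=print_container_output,
                    provide_context=True,  # Airflow 1.x: pass the context as keyword arguments
                    dag=dag)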