I just started using Apache Airflow. I am trying to run a test.sh file from Airflow, but it is not working.
Following is my code; the file name is test.py:
import os
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2015, 6, 1),
    'email': ['[email protected]'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
}

dag = DAG('test', default_args=default_args)

# t1 and t2 are examples of tasks created by instantiating operators
t1 = BashOperator(
    task_id='print_date',
    bash_command='date',
    dag=dag)

create_command = "sh home/ubuntu/test/inst/scripts/test.sh"

if os.path.exists(create_command):
    t2 = BashOperator(
        task_id='cllTest',
        bash_command=create_command,
        dag=dag)
else:
    raise Exception("Cannot locate {}".format(create_command))

t2.set_upstream(t1)
When I run python ~/airflow/dags/test.py, it doesn't throw any error.
However, when I run airflow list_dags, it throws the following error:
[2017-02-15 20:20:02,741] {__init__.py:36} INFO - Using executor SequentialExecutor
[2017-02-15 20:20:03,070] {models.py:154} INFO - Filling up the DagBag from /home/ubuntu/airflow/dags
[2017-02-15 20:20:03,135] {models.py:2040} ERROR - sh home/ubuntu/test/inst/scripts/test.sh
Traceback (most recent call last):
File "/home/ubuntu/anaconda2/lib/python2.7/site-packages/airflow/models.py", line 2038, in resolve_template_files
setattr(self, attr, env.loader.get_source(env, content)[0])
File "/home/ubuntu/anaconda2/lib/python2.7/site-packages/jinja2/loaders.py", line 187, in get_source
raise TemplateNotFound(template)
TemplateNotFound: sh home/ubuntu/test/inst/scripts/test.sh
I have tried to use the answer from "How to run bash script file in Airflow", but it doesn't work.
Where am I making a mistake?
Apache Airflow's BashOperator is an easy way to execute bash commands in your workflow. If your DAG executes a bash command or script, this is the operator to use to define the task. Note, however, that running shell scripts can run into permission trouble, particularly if the script is missing the executable bit (chmod +x).
Add a space after .sh and it should work; this is mentioned in the Confluence page of Airflow:

t2 = BashOperator(
    task_id='sleep',
    bash_command="/home/batcher/test.sh",   # this fails with a "Jinja template not found" error
    #bash_command="/home/batcher/test.sh ", # this works (note the trailing space)
    dag=dag)
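Why the trailing space matters can be sketched in plain Python (a minimal illustration, not Airflow code: BashOperator declares template_ext = ('.sh', '.bash'), and resolve_template_files treats any bash_command ending in one of those extensions as the path of a Jinja template file to load, raising TemplateNotFound when the lookup fails):

```python
# Minimal sketch of Airflow's extension check, assuming BashOperator's
# template_ext of ('.sh', '.bash').
TEMPLATE_EXT = ('.sh', '.bash')

def treated_as_template_file(bash_command):
    # str.endswith accepts a tuple of suffixes
    return bash_command.endswith(TEMPLATE_EXT)

print(treated_as_template_file("/home/batcher/test.sh"))   # True: Jinja file lookup
print(treated_as_template_file("/home/batcher/test.sh "))  # False: run verbatim
```

The trailing space defeats the suffix check, so the string is executed as a bash command instead of being looked up as a template file.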
Alternatively, use just the script path, without "sh": create_command = "/home/ubuntu/test/inst/scripts/test.sh"
Also make sure that the "airflow" user has permission to execute the test.sh script.
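Both points — a bare absolute path and execute permission — can be combined into a small helper (a hypothetical sketch, not part of Airflow; the path shown is the one from the question):

```python
import os
import stat

def prepare_bash_command(script_path):
    """Validate a script path before handing it to BashOperator:
    it must be a bare absolute path (no "sh " prefix) and executable.
    (Hypothetical helper, not part of Airflow.)"""
    if not os.path.isabs(script_path):
        raise ValueError("use an absolute path: {}".format(script_path))
    if not os.path.exists(script_path):
        raise Exception("Cannot locate {}".format(script_path))
    if not os.access(script_path, os.X_OK):
        # equivalent of `chmod u+x` for the owning user
        os.chmod(script_path, os.stat(script_path).st_mode | stat.S_IXUSR)
    return script_path

# create_command = prepare_bash_command("/home/ubuntu/test/inst/scripts/test.sh")
```

Note that os.path.exists in the original code was handed the whole "sh home/ubuntu/..." string, which is not a path at all, so the check could never succeed.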