I'm trying to write our first Airflow DAG, and I'm getting the following error when I try to list the tasks with the command airflow list_tasks orderwarehouse:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/site-packages/airflow/models.py", line 2038, in resolve_template_files
    setattr(self, attr, env.loader.get_source(env, content)[0])
  File "/usr/local/lib/python2.7/site-packages/jinja2/loaders.py", line 187, in get_source
    raise TemplateNotFound(template)
TemplateNotFound: ./home/deploy/airflow-server/task_scripts/orderwarehouse/load_warehouse_tables.sh
This DAG is not supposed to use a template. I'm only trying to run the shell script in the specified location per the instructions in the docs. The shell script does exist in that location and is spelled correctly. My DAG looks like this:
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from datetime import datetime, timedelta

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'start_date': datetime(2015, 6, 1),
    'email': ['[email protected]'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
    # 'queue': 'bash_queue',
    # 'pool': 'backfill',
    # 'priority_weight': 10,
    # 'end_date': datetime(2016, 1, 1),
}

orderwarehouse = DAG('orderwarehouse', default_args=default_args)

load_mysql = BashOperator(
    task_id='load_warehouse_mysql',
    bash_command='./home/deploy/airflow-server/task_scripts/orderwarehouse/load_warehouse_tables.sh',
    dag=orderwarehouse)
Not sure why it thinks it needs to look for a Jinja template. Running out of ideas on this one, would appreciate if anyone can point me to where I'm going astray. Thanks.
The Airflow BashOperator does exactly what you are looking for. It is a simple but powerful operator that lets you execute a bash script, a single command, or a set of commands from your DAGs.
This is a known pitfall of Airflow. The BashOperator treats bash_command as a templated field, and any string ending in one of its template extensions (.sh or .bash) is loaded from disk as a Jinja template, which is why you get TemplateNotFound. Add a space at the end of your bash_command and it should run fine.
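To see why the trailing space matters, here is a minimal sketch of the check Airflow applies to templated fields (simplified from the resolve_template_files logic; treated_as_template_file is an illustrative name, not part of Airflow's API):

```python
# BashOperator declares template_ext = ('.sh', '.bash').
template_ext = ('.sh', '.bash')

def treated_as_template_file(bash_command):
    # Simplified sketch: a templated string ending in a template
    # extension is resolved as a Jinja template file on disk.
    return bash_command.endswith(template_ext)

# Ends in '.sh' -> Airflow tries to load it as a template -> TemplateNotFound
print(treated_as_template_file('/home/deploy/task_scripts/load_warehouse_tables.sh'))   # True

# Trailing space defeats the extension check -> runs as a plain command
print(treated_as_template_file('/home/deploy/task_scripts/load_warehouse_tables.sh '))  # False
```

The trailing space makes the string no longer end in `.sh`, so Airflow passes it straight to bash instead of the Jinja loader.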
Source: https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=62694614
You should add a space at the end of the file path. The same rule applies whichever operator you are using.
load_mysql = BashOperator(
    task_id='load_warehouse_mysql',
    bash_command='/home/deploy/airflow-server/task_scripts/orderwarehouse/load_warehouse_tables.sh ',
    dag=orderwarehouse)