We use dbt with GCP and BigQuery for our transformations, and the simplest approach to scheduling our daily `dbt run` seems to be a `BashOperator` in Airflow. Currently we have two separate directories / GitHub projects, one for dbt and another for Airflow. To schedule dbt to run with Airflow, it seems like our entire dbt project would need to be nested inside our Airflow project, so that we can point to it for our `dbt run` bash command?
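Something like this, where the path is hypothetical and assumes the dbt project is nested inside the Airflow project:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG("dbt_daily", start_date=datetime(2021, 1, 1), schedule_interval="@daily", catchup=False) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        # hypothetical path: only resolvable if the dbt project lives inside the Airflow project
        bash_command="cd /opt/airflow/dags/dbt && dbt run",
    )
```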
Is it possible to trigger our `dbt run` and `dbt test` without moving our dbt directory inside our Airflow directory? With the airflow-dbt package, could we perhaps point the `dir` in the `default_args` at the GitHub link for the dbt project?
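For reference, the airflow-dbt pattern we're looking at is roughly this (a sketch; the `dir` value is a placeholder local path, and whether a GitHub URL works there is exactly what we're unsure of):

```python
from datetime import datetime
from airflow import DAG
from airflow_dbt.operators.dbt_operator import DbtRunOperator, DbtTestOperator

default_args = {
    "dir": "/path/to/dbt/project",  # placeholder; this is the setting we'd like to point elsewhere
    "start_date": datetime(2021, 1, 1),
}

with DAG("dbt_dag", default_args=default_args, schedule_interval="@daily") as dag:
    dbt_run = DbtRunOperator(task_id="dbt_run")
    dbt_test = DbtTestOperator(task_id="dbt_test")
    dbt_run >> dbt_test
```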
My advice would be to keep your dbt and Airflow codebases separated. There is indeed a better way: build a Docker image containing your dbt code, then use a `DockerOperator` in your Airflow DAG to run that image.

I'm assuming that you use the Airflow `LocalExecutor` here and that you want to execute your `dbt run` workload on the server where Airflow is running. If that's not the case and you have access to a Kubernetes cluster, I would suggest using the `KubernetesPodOperator` instead.
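A minimal sketch of that setup, assuming a hypothetical image name and the default local Docker socket (the import path is for the Airflow 2 Docker provider):

```python
from datetime import datetime
from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator

with DAG(
    dag_id="dbt_docker",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = DockerOperator(
        task_id="dbt_run",
        image="my-org/dbt:latest",                # hypothetical image built from the dbt repo
        command="dbt run",
        docker_url="unix://var/run/docker.sock",  # local daemon, matching the LocalExecutor case
    )
```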
I accepted the other answer based on the consensus via upvotes and the supporting comment; however, I'd like to post a second solution that I'm currently using:
- Our `dbt` and `airflow` repos / directories are next to each other.
- In our `docker-compose.yml`, we've added our dbt directory as a volume so that Airflow has access to it (see the compose excerpt below).
- In our Airflow `Dockerfile`, we install dbt and copy our `dbt` code (see the Dockerfile excerpt below).
- We use a `BashOperator` to run `dbt run` and `dbt test` (see the DAG sketch below).
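The volume mount in `docker-compose.yml` looks roughly like this (the service name and paths are examples, not our exact config):

```yaml
# excerpt: mount the sibling dbt repo into the Airflow container
services:
  webserver:
    volumes:
      - ../dbt:/dbt   # ../dbt is the dbt repo sitting next to the airflow repo
```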
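And the relevant lines of the Airflow `Dockerfile`, again with example versions and paths:

```dockerfile
# excerpt: install dbt into the Airflow image and copy the project in
FROM apache/airflow:2.3.0
RUN pip install --no-cache-dir dbt-bigquery
COPY dbt /dbt
```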
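Finally, the DAG itself is just two `BashOperator` tasks; a sketch, assuming the project is available at `/dbt` inside the container:

```python
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator  # airflow.operators.bash_operator on Airflow 1.10

with DAG(
    dag_id="dbt_daily",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /dbt && dbt run",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /dbt && dbt test",
    )
    dbt_run >> dbt_test  # only test after a successful run
```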