We have an automated FTP process set up which imports a data file into Google Cloud Storage daily.
I would like to set up a daily automated job that loads this CSV into a BigQuery table.
What is the best way to do this? My current first thought is to set up an App Engine instance with a cron job that runs a Python script every day. Is there a better solution?
A Background Cloud Function with a Cloud Storage trigger is your best choice!
You can set it to monitor a specific bucket for new files and execute your load script whenever the trigger fires.
Forgot to mention - Cloud Functions support (as of now) only Node.js for scripting - which is usually not a problem, but I just wanted to mention it :o)