
Unable to use the JSON body of a GCP Cloud Scheduler job as parameter values in a Cloud Function

I have a Cloud Scheduler job that triggers my Cloud Function via an HTTP call. In the Cloud Function I want to build a dynamic query, so I pass some parameters from Cloud Scheduler as a JSON body. However, when the scheduler triggers the function, the parameter values from the JSON body are not picked up. Can anyone help me resolve this issue?

JSON body sent from Cloud Scheduler:

{ 
   "unit":"QA",
   "interval":"3"
}

Cloud function code:

from google.cloud import bigquery

def main(request):

    request_json = request.get_json(silent=True)
    request_args = request.args

    if request_json and 'unit' in request_json:
        unit = request_json['unit']
    elif request_args and 'unit' in request_args:
        unit = request_args['unit']
    else:
        unit = 'UAT'

    if request_json and 'interval' in request_json:
        interval = request_json['interval']
    elif request_args and 'interval' in request_args:
        interval = request_args['interval']
    else:
        interval = 1

    query = "select * from `myproject.mydataset.mytable` where unit='{}' and interval={}".format(unit, interval)
    client = bigquery.Client()
    job_config = bigquery.QueryJobConfig()
    # destination_project, destination_dataset and destination_table are defined elsewhere
    dest_dataset = client.dataset(destination_dataset, destination_project)
    dest_table = dest_dataset.table(destination_table)
    job_config.destination = dest_table
    job_config.create_disposition = 'CREATE_IF_NEEDED'
    job_config.write_disposition = 'WRITE_APPEND'
    job = client.query(query, location='US', job_config=job_config)
    job.result()

Note: It works when I pass the same values as query-string arguments in the HTTP URL (https://my-region-test-project.cloudfunctions.net/mycloudfunction?unit=QA&interval=3).
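For illustration, here is a minimal sketch (assuming the requests package and reusing the URL from the note above) of the two ways of invoking the function. Only the second one carries a Content-Type: application/json header, which is what request.get_json() relies on:

import requests

url = "https://my-region-test-project.cloudfunctions.net/mycloudfunction"

# 1) Query-string arguments -> read by the function via request.args
requests.get(url, params={"unit": "QA", "interval": "3"})

# 2) JSON body -> read via request.get_json(); requests sets
#    Content-Type: application/json automatically, Cloud Scheduler does not
requests.post(url, json={"unit": "QA", "interval": "3"})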

asked Sep 24 '19 by Kaustubh Ghole


People also ask

How do you trigger a pub/sub topic?

For Cloud Functions (1st gen): In the Trigger type field, select Cloud Pub/Sub. In the Select a Cloud Pub/Sub topic field, select a topic for the trigger to monitor. Messages published to this topic will trigger calls to your function.
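For reference, a minimal sketch (hypothetical function name) of a 1st-gen Python background function that such a Pub/Sub trigger would invoke; the message payload arrives base64-encoded in event["data"]:

import base64
import json

def handle_pubsub_message(event, context):
    # event["data"] holds the Pub/Sub message payload, base64-encoded
    if "data" in event:
        payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
        print(f"Received message: {payload}")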

Why does cloud deployment fail?

Cloud Functions deployment can fail if the entry point to your code, that is, the exported function name, is not specified correctly. Your source code must contain an entry point function that has been correctly specified in your deployment, either via Cloud console or Cloud SDK.

Which triggers are supported by Cloud Functions?

GCF supports the following trigger types: HTTP, Cloud Pub/Sub, and other sources like Firebase. HTTP events trigger HTTP functions, and all other event types trigger background functions. HTTP Functions pass the ExpressJS parameters (request, response).
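In the Python runtime the equivalent is a single Flask request argument; a minimal sketch (hypothetical function name):

def hello_http(request):
    # request is a flask.Request; the return value becomes the HTTP response
    data = request.get_json(silent=True) or {}
    name = data.get("name") or request.args.get("name", "World")
    return f"Hello, {name}!"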


2 Answers

You can override the default Content-Type by creating the cron job using gcloud with the flag --headers Content-Type=application/json.

For instance:

gcloud scheduler jobs create http my_cron_job \
  --schedule="every 5 hours" \
  --uri="https://${ZONE}-${PROJECT_ID}.cloudfunctions.net/${FUNCTION_NAME}" \
  --http-method=POST \
  --message-body='{"foo": "bar"}' \
  --headers Content-Type=application/json

Originally this didn't seem to be available from the GCP Console. Update 08/2021: it has since been implemented in the UI:

[Screenshot: HTTP headers field in the Cloud Scheduler job configuration UI]


Alternatively, using force=True seems to help if you're using Flask:

request.get_json(force=True)

This is because Cloud Scheduler seems to set the default Content-Type header to application/octet-stream. See the docs.
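Applied to the question's function, a minimal sketch (an assumption, not the answerer's code) of parsing the body regardless of the Content-Type header:

def main(request):
    # force=True makes Flask parse the body as JSON even when the header
    # is application/octet-stream; silent=True returns None instead of raising
    payload = request.get_json(silent=True, force=True) or {}
    unit = payload.get("unit", request.args.get("unit", "UAT"))
    interval = payload.get("interval", request.args.get("interval", 1))
    return f"unit={unit}, interval={interval}"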

answered Oct 01 '22 by Voy


The best hint is that this is a UTF-8 encoding issue.

Also check out the situations described in this other thread: HTTP Triggering Cloud Function with Cloud Scheduler

answered Oct 02 '22 by Pentium10