I am testing my Azure Function (v2, targeting .NET Core) with a QueueTrigger locally, with the following configuration in the host.json file:
"queues": { "batchSize": 1, "newBatchThreshold": 0 }
The intent is to limit each Function App instance to processing only one queue message at a time.
According to this Azure Functions doc:
If you want to minimize parallel execution for queue-triggered functions in a function app, you can set the batch size to 1. This setting eliminates concurrency only so long as your function app runs on a single virtual machine (VM).
The doc's host.json example has these configs:
{ "queues": { "maxPollingInterval": 2000, "visibilityTimeout" : "00:00:30", "batchSize": 16, "maxDequeueCount": 5, "newBatchThreshold": 8 } }
In our case, I'm not trying to eliminate concurrency, but I am trying to make sure each Function App instance only processes one queue message at a time. Then, if we scale out the Function App to run on multiple VMs, each VM is guaranteed to process only one queue message at a time. To be more specific, the plan is to run the Azure Function under an App Service plan instead of a Consumption plan (because you have very little control with the Consumption plan), and set a Scale Out rule that monitors a queue, up to N instances (VMs). This setup lets us dedicate each VM to running one Function App instance at a time, up to N VMs, as sketched below.
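As a rough sketch of that scale-out setup with the Azure CLI (resource names are placeholders, and the queue-depth metric name and source resource ID are assumptions to verify against the Azure Monitor autoscale docs):

# Create autoscale settings on the App Service plan (names are placeholders).
az monitor autoscale create \
  --resource-group my-rg \
  --resource my-plan \
  --resource-type Microsoft.Web/serverfarms \
  --name queue-scale \
  --min-count 1 \
  --max-count 5 \
  --count 1

# Scale out by one instance based on queue depth; the metric name and
# source resource ID here are assumptions, not verified.
az monitor autoscale rule create \
  --resource-group my-rg \
  --autoscale-name queue-scale \
  --scale out 1 \
  --resource "/subscriptions/<sub>/resourceGroups/my-rg/providers/Microsoft.Storage/storageAccounts/<account>/queueServices/default" \
  --condition "QueueMessageCount > 5 avg 5m"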
When I test this locally, my Azure Function always grabs multiple messages from the queue at the same time, even with the "batchSize": 1 setting in the host.json file. I'm wondering if that's because I'm testing in the local Azure Functions runtime. I have not tested this in Azure yet; hopefully it works as expected there.
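For context, a minimal sketch of the kind of queue-triggered function I'm testing with (the queue name, connection setting, and artificial delay are illustrative, not from my actual project):

using System;
using System.Threading.Tasks;
using Microsoft.Azure.WebJobs;
using Microsoft.Extensions.Logging;

public static class QueueProcessor
{
    [FunctionName("QueueProcessor")]
    public static async Task Run(
        [QueueTrigger("work-items", Connection = "AzureWebJobsStorage")] string message,
        ILogger log)
    {
        // Log start/finish timestamps so overlapping invocations are easy to spot.
        log.LogInformation($"Start: {message} at {DateTime.UtcNow:O}");

        // Simulate slow work; if batchSize = 1 is honored, these should not overlap.
        await Task.Delay(TimeSpan.FromSeconds(10));

        log.LogInformation($"Finish: {message} at {DateTime.UtcNow:O}");
    }
}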
The issue turned out to be that "queues" wasn't nested under "extensions" in the host.json file.
Example:
{
    "version": "2.0",
    "extensions": {
        "queues": {
            "maxPollingInterval": "00:00:02",
            "visibilityTimeout": "00:00:30",
            "batchSize": 16,
            "maxDequeueCount": 5,
            "newBatchThreshold": 8
        }
    }
}
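With "queues" nested under "extensions" like this, the one-message-at-a-time goal from the question is expressed by setting "batchSize": 1 and "newBatchThreshold": 0 inside that same block.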
The referenced extension (Microsoft.Azure.WebJobs.Extensions.Storage) should also be at least version 3.0.1 in this case, as there was previously a bug with setting newBatchThreshold to 0.
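In the function project's .csproj, that translates to something like the following (the exact patch version may be newer):

<ItemGroup>
  <!-- 3.0.1 or later; earlier versions had the newBatchThreshold = 0 bug noted above -->
  <PackageReference Include="Microsoft.Azure.WebJobs.Extensions.Storage" Version="3.0.1" />
</ItemGroup>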