I have multiple records:
{
  "item_id": 1,
  "key1": "data1"
}
{
  "item_id": 2,
  "key1": "data1"
}
{
  "item_id": 2,
  "key1": "data1"
}
{
  "item_id": 1,
  "key1": "data1"
}
I do not want to process them sequentially, and there can be more than 200 records. Should I process them using Parallel For Each or Scatter-Gather? Which approach would best fit my requirement? I do not need the accumulated response, but if an exception occurs while processing any one record (I hit an API for each record based on an if condition), the processing of the other records must remain unaffected.
Why not use the VM module instead: break the collection into its individual records and push them to a VM queue, then have another flow with a VM listener pick up the individual records (in parallel) and process them. An error in one record then only fails that one queue message. Here are more details: https://docs.mulesoft.com/mule-runtime/4.2/reliability-patterns
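A minimal sketch of that approach, assuming a VM config named vmConfig, a queue named recordsQueue, and an HTTP request configuration named HTTP_Request_config (all hypothetical names); the check on item_id stands in for your if condition. Namespace declarations on the <mule> root are omitted.

<vm:config name="vmConfig">
    <vm:queues>
        <vm:queue queueName="recordsQueue" />
    </vm:queues>
</vm:config>

<!-- Flow 1: split the incoming array and publish each record to the VM queue -->
<flow name="split-records-flow">
    <foreach collection="#[payload]">
        <vm:publish config-ref="vmConfig" queueName="recordsQueue" />
    </foreach>
</flow>

<!-- Flow 2: consume records from the queue; numberOfConsumers controls parallelism -->
<flow name="process-record-flow">
    <vm:listener config-ref="vmConfig" queueName="recordsQueue" numberOfConsumers="16" />
    <choice>
        <!-- hypothetical condition: only call the API for records with item_id 1 -->
        <when expression="#[payload.item_id == 1]">
            <http:request method="POST" config-ref="HTTP_Request_config" path="/items" />
        </when>
        <otherwise>
            <logger level="DEBUG" message="Record skipped" />
        </otherwise>
    </choice>
    <!-- A failure here only affects the current record; the listener keeps
         consuming and processing the remaining queue messages -->
    <error-handler>
        <on-error-continue>
            <logger level="ERROR" message="#['Failed to process record: ' ++ (error.description default '')]" />
        </on-error-continue>
    </error-handler>
</flow>

With on-error-continue in the listener flow, an API failure for one record is logged and the message is acknowledged, so the remaining records on the queue are processed normally; adjust numberOfConsumers to tune how many records are handled in parallel.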