I have a web service that is running on a cluster of servers. This web service does some internal processing, and then may make a call out to an external service that incurs a fee.
I want to put in some caching so that if I receive identical requests to the service (which is guaranteed to happen), I do not have to repeat the processing, saving both processing time and the fee incurred by the external service call.
However, I am struggling to figure out how to manage this caching given the constraints of a distributed environment: how can I hold off on executing the later, identical service calls until the first one has responded (and its result is therefore available in the cache)?
I have thought about putting in a front-proxy pattern and building up a queue of identical requests within the proxy, so that when the first request returns, the proxy can return the same response to the queued ones. Is this the correct pattern, or is there a better concurrency pattern that deals with this scenario?
You could:

1. Compute a hash of the incoming request.
2. Look the hash up in a shared database.
3. If it is not there, insert it with a "result pending" status, perform the processing and the external call, then store the result.
4. If it is there with a finished result, return that result immediately.
At step 2, if the hash is already in the database with the "result pending" status, you could poll the database every X milliseconds and finally return the result once it's there.
The devil is in the details, of course, because you have to decide what to do when an error occurs: does the first request mark the entry as failed so that waiting requests can retry, or do they all receive the same error? You also need some timeout or expiry, so that a server crashing mid-request does not leave an entry stuck in "result pending" forever.
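The steps above can be sketched as follows. This is a minimal illustration, not a production implementation: the table and status names are assumptions, SQLite stands in for whatever shared database the cluster uses, and the PRIMARY KEY constraint provides the atomic "first writer wins" claim that makes the scheme safe across servers:

```python
import json
import sqlite3
import time

# Hypothetical schema; names are illustrative, not from the original post.
SETUP = """
CREATE TABLE IF NOT EXISTS response_cache (
    request_hash TEXT PRIMARY KEY,
    status       TEXT NOT NULL,   -- 'pending' | 'done' | 'error'
    result       TEXT
)
"""

def cached_call(conn, request_hash, expensive_fn,
                poll_interval=0.05, timeout=10.0):
    # Steps 1-3: atomically claim the hash. The PRIMARY KEY constraint
    # guarantees exactly one worker wins, even across servers sharing
    # the database.
    try:
        conn.execute(
            "INSERT INTO response_cache (request_hash, status) "
            "VALUES (?, 'pending')",
            (request_hash,))
        conn.commit()
    except sqlite3.IntegrityError:
        # Another worker is (or was) handling it: poll until resolved,
        # with a timeout in case that worker crashed mid-request.
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            row = conn.execute(
                "SELECT status, result FROM response_cache "
                "WHERE request_hash = ?",
                (request_hash,)).fetchone()
            if row and row[0] == 'done':
                return json.loads(row[1])       # step 4: cached result
            if row and row[0] == 'error':
                raise RuntimeError("original request failed")
            time.sleep(poll_interval)
        raise TimeoutError("gave up waiting for cached result")

    # We won the insert: do the work and publish the outcome either way,
    # so that waiting requests are never left polling forever.
    try:
        result = expensive_fn()
    except Exception:
        conn.execute(
            "UPDATE response_cache SET status = 'error' "
            "WHERE request_hash = ?", (request_hash,))
        conn.commit()
        raise
    conn.execute(
        "UPDATE response_cache SET status = 'done', result = ? "
        "WHERE request_hash = ?",
        (json.dumps(result), request_hash))
    conn.commit()
    return result
```

One design note: polling is the simplest option, but if your database or cache supports notifications (e.g. Postgres LISTEN/NOTIFY or Redis pub/sub), waiters can subscribe instead of polling and be woken as soon as the result lands.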