I have a REST API where I would like to cache the JSON response of the index (GET /foo) and the read actions (GET /foo/1) to significantly increase the performance. When there is a POST or a PUT on a resource the cache entries for the index and read results need to be expired, so no old content is served.
Is this a scenario that's best done with a Reverse proxy like Squid / Varnish or would you choose memcache(d)?
A reverse-proxy cache stores copies of responses in a temporary storage location so they can be served faster. It temporarily saves data on behalf of applications, servers, and web browsers, so clients need not re-download unchanged content every time they access a website or application.
Caching in REST APIs: responses to POST requests are not cacheable by default, but they can be made cacheable if an Expires header, or a Cache-Control header with a directive that explicitly allows caching, is added to the response. Responses to PUT and DELETE requests are not cacheable at all.
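Those default rules can be sketched as a small predicate. This is a deliberately simplified illustration of the paragraph above, not a full implementation of the HTTP caching spec (RFC 9111 has many more conditions, e.g. `no-store` on POST responses):

```python
def is_cacheable(method: str, headers: dict) -> bool:
    """Simplified sketch of default HTTP response cacheability.

    `headers` holds the response headers; lookups are case-insensitive.
    """
    headers = {k.lower(): v for k, v in headers.items()}
    if method == "GET":
        # GET responses are cacheable by default, unless forbidden.
        return "no-store" not in headers.get("cache-control", "").lower()
    if method == "POST":
        # POST responses need explicit freshness info to be cacheable.
        return "expires" in headers or "cache-control" in headers
    # Responses to PUT and DELETE are not cacheable at all.
    return False
```

For example, `is_cacheable("POST", {})` is `False`, while `is_cacheable("POST", {"Cache-Control": "max-age=60"})` is `True`.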
If some recurring requests produce the same response, we can use a cached version of the response to avoid the excessive load.
Proxy caching is a feature of proxy servers that stores content on the proxy server itself, allowing web services to serve those resources to more users. The proxy server coordinates with the origin server to cache documents such as files, images and web pages.
Using a reverse proxy that sits on the HTTP layer is more transparent: you can see what's going on over the wire. The downside is that few of these support caching authenticated responses, so their efficiency may drop to zero if your resources require authentication. Reverse proxies also don't usually automatically expire resource A (/foo) when a completely different resource B (/foo/1) is modified. That's correct behaviour that you'd have to add to your solution somehow.
Both of these problems can be solved if you use memcached, since it doesn't have the transparency requirement.
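With memcached your application code does the lookups and the invalidation itself. A minimal in-process sketch of that pattern follows; a real deployment would use a memcached client (e.g. pymemcache) instead of the plain dicts standing in here for the cache and the database:

```python
# Stand-ins for memcached and the backing store (illustrative only).
db = {"1": {"name": "first"}}
cache = {}

def get_index():
    """GET /foo -- serve the cached index, filling the cache on a miss."""
    if "foo:index" not in cache:
        cache["foo:index"] = list(db.values())
    return cache["foo:index"]

def get_item(item_id):
    """GET /foo/<id> -- same read-through pattern per item."""
    key = f"foo:{item_id}"
    if key not in cache:
        cache[key] = db[item_id]
    return cache[key]

def put_item(item_id, data):
    """PUT /foo/<id> -- write, then expire the affected cache entries."""
    db[item_id] = data
    cache.pop("foo:index", None)       # index may now be stale
    cache.pop(f"foo:{item_id}", None)  # so may the item itself
```

The point is that the write path knows exactly which keys a POST or PUT invalidates, which is the expiry logic a reverse proxy won't do for you automatically.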
I would go for a reverse proxy like Varnish because you can implement (and test) your service without involving any cache logic, and add caching as a separate layer. You can upgrade/restart your service while Varnish serves old results for GET requests (great for availability), and it's easy to set up rules in Varnish to invalidate (purge) existing cache results based on specific GET/POST actions.
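As a rough illustration of such a purge rule, here is a hedged VCL sketch (Varnish 4+ syntax). The `purgers` ACL and the convention that your application sends an HTTP `PURGE` request for /foo and /foo/1 after each POST/PUT are assumptions for the example, not something Varnish does out of the box:

```vcl
# Only trusted hosts may purge (assumed: app runs on the same machine).
acl purgers {
    "127.0.0.1";
}

sub vcl_recv {
    if (req.method == "PURGE") {
        if (!client.ip ~ purgers) {
            return (synth(405, "Method not allowed"));
        }
        # Drop the cached object for req.url (e.g. /foo or /foo/1).
        return (purge);
    }
}
```

Your service would then issue `PURGE /foo` and `PURGE /foo/1` to Varnish whenever item 1 is created or modified, so stale index and read responses are never served.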