 

REST API caching: should I use a reverse proxy or memcache(d)?

I have a REST API where I would like to cache the JSON responses of the index action (GET /foo) and the read action (GET /foo/1) to significantly increase performance. When there is a POST or a PUT on a resource, the cache entries for the index and read results need to be expired so that no stale content is served.

Is this a scenario best handled with a reverse proxy like Squid or Varnish, or would you choose memcache(d)?
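To make the requirement concrete, here is a minimal sketch of the behaviour I'm after (a plain in-process dict stands in for whichever cache ends up being used, and the render functions are just placeholders for the real logic):

    cache = {}

    def render_index():              # placeholder: expensive query + JSON serialization
        return '[{"id": 1}, {"id": 2}]'

    def render_item(item_id):        # placeholder
        return '{"id": %d}' % item_id

    def get_index():                 # GET /foo
        if "/foo" not in cache:
            cache["/foo"] = render_index()
        return cache["/foo"]

    def get_item(item_id):           # GET /foo/1
        key = "/foo/%d" % item_id
        if key not in cache:
            cache[key] = render_item(item_id)
        return cache[key]

    def write_item(item_id, data):   # POST /foo or PUT /foo/1
        # ...persist the change, then expire both entries so no stale JSON is served
        cache.pop("/foo", None)
        cache.pop("/foo/%d" % item_id, None)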

asked Sep 03 '10 by Ward Bekker

People also ask

What is reverse proxy cache?

A reverse proxy cache stores copies of data or files in a temporary storage location so they can be accessed faster. It temporarily saves data for applications, servers, and web browsers, which means users need not download the same information every time they access a website or application.

How do I cache a REST API?

Caching in REST APIs: POST requests are not cacheable by default, but they can be made cacheable if either an Expires header or a Cache-Control header with a directive that explicitly allows caching is added to the response. Responses to PUT and DELETE requests are not cacheable at all.
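For example, a handler can opt its GET responses into shared caching simply by setting such a header (a minimal sketch assuming a Flask app; the route and payload are made up):

    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/foo")
    def foo_index():
        resp = jsonify([{"id": 1}, {"id": 2}])
        # public: shared caches (e.g. a reverse proxy) may store the response;
        # max-age: how many seconds the stored copy counts as fresh
        resp.headers["Cache-Control"] = "public, max-age=60"
        return resp

A reverse proxy such as Varnish or Squid can then cache GET /foo for up to 60 seconds without any further application changes.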

When would you use caching in an API?

If some recurring requests produce the same response, we can serve a cached version of the response to avoid excessive load.

Does a proxy cache?

Proxy caching is a feature of proxy servers that stores content on the proxy server itself, allowing web services to serve those resources to more users. The proxy server coordinates with the origin server to cache documents such as files, images and web pages.


2 Answers

Using a reverse proxy that sits on the HTTP layer is more transparent: you can see exactly what is going on over the wire. The downside is that few of these proxies support caching authenticated responses, so their efficiency may drop to zero if your resources require authentication. Reverse proxies also don't usually expire resource A (/foo) automatically when resource B (/foo/1), which from the proxy's point of view is completely unrelated, is modified. That is correct HTTP behaviour, so you would have to add that invalidation logic to your solution somehow.

Both of these problems can be solved if you use memcached, since it doesn't have the transparency requirement.
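A minimal sketch of that approach, assuming the pymemcache client (any memcached client with get/set/delete works the same way) and a made-up key scheme:

    import json
    from pymemcache.client.base import Client

    mc = Client(("localhost", 11211))

    def load_index_from_db():        # placeholder for the real query
        return [{"id": 1}, {"id": 2}]

    def get_index_json():            # GET /foo
        cached = mc.get("foo:index")
        if cached is not None:
            return cached            # cached JSON bytes
        body = json.dumps(load_index_from_db()).encode("utf-8")
        mc.set("foo:index", body, expire=300)   # safety-net TTL in case a delete is missed
        return body

    def on_write(item_id):           # called from the POST/PUT handlers
        mc.delete("foo:index")       # explicit invalidation: no stale index
        mc.delete("foo:%d" % item_id)

Here the application drives the invalidation itself, which is exactly the part a reverse proxy would not do for you automatically.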

answered by mogsie


I would go for a reverse proxy like Varnish because you can implement (and test) your service without involving any cache logic and add caching as a separate layer. You can upgrade/restart your service while Varnish serves old results for GET requests (great for availability), and it's easy to set up rules in Varnish to invalidate (purge) existing cache results based on specific GET/POST actions.
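A sketch of that application-side invalidation, assuming Varnish is reachable at a made-up address and its VCL has been set up to accept PURGE requests from the application host:

    import requests

    VARNISH_URL = "http://127.0.0.1:6081"   # hypothetical address of the Varnish instance

    def purge(path):
        # Varnish only honours PURGE if vcl_recv handles it; otherwise this returns 405
        requests.request("PURGE", VARNISH_URL + path, timeout=2)

    def after_write(item_id):
        # call this from the POST/PUT handlers so both cached representations are dropped
        purge("/foo")
        purge("/foo/%d" % item_id)

With this split the service code stays cache-free; only the write handlers need to know that a purge hook exists.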

answered by ivy