 

How to use Varnish to cache RESTful API, but still use HMAC for signing/verifying each request?

I am interested in using Varnish to cache/throttle/etc. responses to a RESTful API I am creating. I may be using the term/acronym "HMAC" too loosely, but what I mean is that each request to my API should include a header containing a hash that the client calculated by hashing parts of the request (including a timestamp) with a shared secret. The server then calculates the same hash from the same ingredients of the request and determines whether the request is valid and should be responded to.

This works well enough, but now I would like to use Varnish to cache my API responses. The nature of HMAC requires the hash to be calculated for each request to verify that the user is who they claim to be, but the actual response that is returned is the same - so the meat of the API call is very much cacheable.

What I'd like (and I'm assuming this can be achieved, I just don't know HOW) is to pass the authentication task to the backend, have it somehow tell Varnish "yes, go ahead and respond to this request" or "no, don't respond to this request", and from there let Varnish determine whether the request can be served from cache or not.

Even more ideal would be to do something slightly fancier and allow Varnish to handle the authentication itself, or pass the HMAC processing onto something faster than the backend. For example, the API might store the client secret/public key in a Redis cache, and Varnish might then actually calculate the hash itself using the values from Redis.

asked Feb 19 '14 by Kevin Mitchell



1 Answer

You should be able to implement the fancier solution in VCL (Varnish Configuration Language) code by using two Varnish modules:

  • Redis vmod to fetch keys.
  • Varnish Digest Module for calculating/processing HMAC.

Both modules are used in production, as listed in the modules directory.

If Varnish handles the authentication in VCL, you can let Varnish cache your API backend response and deliver it only for authenticated requests.
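
To make that concrete, here is a rough VCL 4.0 sketch of that approach. Treat it as an illustration rather than working configuration: the constructor parameters and function names of the Redis vmod (redis.db(), .command(), .push(), .execute(), .get_string_reply()) vary between vmod versions, digest.hmac_sha256() comes from the digest vmod, and the header names (X-Api-Key, X-Timestamp, X-Signature) and the string being signed are assumptions you would replace with whatever your API actually defines.

    vcl 4.0;

    import digest;
    import redis;

    backend default {
        .host = "127.0.0.1";
        .port = "8080";
    }

    sub vcl_init {
        # Connection to the Redis instance holding "secret:<api key> -> shared secret".
        # Constructor parameters differ between vmod-redis versions; check its docs.
        new db = redis.db(location="127.0.0.1:6379");
    }

    sub vcl_recv {
        # Look up the shared secret for the client's public API key.
        db.command("GET");
        db.push("secret:" + req.http.X-Api-Key);
        db.execute();
        set req.http.X-Secret = db.get_string_reply();

        if (!req.http.X-Secret) {
            return (synth(401, "Unknown API key"));
        }

        # Recompute the signature over the same ingredients the client hashed
        # (method, URL and timestamp here) and compare it to the client's value.
        # The encoding (hex vs. base64) must match what the client sends.
        if (digest.hmac_sha256(req.http.X-Secret,
                req.method + req.url + req.http.X-Timestamp)
                != req.http.X-Signature) {
            return (synth(401, "Invalid signature"));
        }

        # Authenticated. Drop the secret so it never reaches the backend or the
        # cache, then let Varnish serve the (shared) cached object if it has one.
        unset req.http.X-Secret;
        return (hash);
    }

Because the per-request signature and timestamp are not part of the cache key (by default Varnish hashes only the URL and Host), all authenticated clients share the same cached object, which is what makes this worthwhile.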

If the HMAC implementation requires the request body:

As Gridfire points out in his/her answer, Varnish cannot access the request body, and we cannot (and should not) send the full request body in an HTTP header from the backend/application.

But we can send a hash/digest of the full request body in an HTTP header. Calculating the hash on the backend should be negligible compared to generating the output (markup, data, whatever). AFAICT there should be no cryptographic/practical downsides to this method as long as the hash/digest and HMAC are robust and the digest is long enough (256 bits or more). Performance testing is advised, as usual.
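
In terms of the sketch above, the body digest simply becomes one more header that Varnish folds into the string it verifies, regardless of which side computes it. The header name X-Body-Digest below is a made-up placeholder for "SHA-256 of the request body":

    sub vcl_recv {
        # The body itself never has to reach VCL; only its digest does, as a header.
        # X-Body-Digest is a placeholder name; use whatever your API defines.
        if (digest.hmac_sha256(req.http.X-Secret,
                req.method + req.url + req.http.X-Timestamp + req.http.X-Body-Digest)
                != req.http.X-Signature) {
            return (synth(401, "Invalid signature"));
        }
    }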

answered Sep 28 '22 by Geir Bostad