Disadvantages to rack-cache vs. Varnish in Heroku cedar stack?

The previous two Heroku application stacks came with a Varnish layer, which automatically reverse-proxy-cached content based on HTTP headers.

The new Heroku Cedar stack doesn't have this Varnish layer. Heroku suggests using rack-cache and memcache instead.

Does this have disadvantages compared to the previous stacks with the Varnish layer? With rack-cache, aren't there fewer servers serving the caching layer, and in a less optimized way?

  • http://devcenter.heroku.com/articles/http-routing
  • http://devcenter.heroku.com/articles/http-caching
  • http://devcenter.heroku.com/articles/cedar
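
For context, a minimal sketch of what the suggested rack-cache + memcache setup might look like in a bare Rack app's config.ru (the store URLs and MyApp are placeholders; on Heroku the memcache add-on supplies the real connection details via environment variables):

    # config.ru -- illustrative sketch only
    require 'rack/cache'
    require 'dalli'

    use Rack::Cache,
      :verbose     => true,
      :metastore   => 'memcached://localhost:11211/meta',  # where cache metadata lives
      :entitystore => 'memcached://localhost:11211/body'   # where cached response bodies live

    run MyApp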
asked Oct 28 '11 by John Bachir



3 Answers

There is absolutely no question that removing the Varnish layer is a huge downgrade in cache performance (both latency and throughput) from Heroku's Bamboo to Cedar stack. For one, your application requests are competing with, and may be queued behind, cache hits for dyno time.

The disadvantages, to name a few:

  • interpreted Ruby (vs. compiled C)
  • application level (vs. proxy level)
  • memcached based (vs. in-process memory based)
  • blocking I/O based (vs. non-blocking I/O based)

Any suggestion that the two caching strategies could compete at scale is ridiculous. You won't see much difference on a small scale, but if you have hundreds or even thousands of cache hits per second, Varnish will not substantially degrade, while the Cedar stack would require many dynos just to serve static assets in a performant fashion. (I'm seeing ~5-10ms of dyno processing time for cache hits on Cedar.)

Cedar was built the way it was built not to improve performance, but to introduce new levels of flexibility over your application. While it allows non-blocking I/O operations like long polling and gives you fine-grained control over static asset caching parameters, it's clear that Heroku would like you to bring your own caching/bandwidth if your goal is internet scale.
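
As a rough illustration of the static asset caching parameters you now control yourself on Cedar, a sketch assuming a Rails 3.1 app (the max-age value is arbitrary):

    # config/environments/production.rb -- illustrative sketch
    config.serve_static_assets = true                        # Cedar has no separate layer serving /public
    config.static_cache_control = "public, max-age=31536000" # lets rack-cache and browsers keep assets
    config.action_controller.perform_caching = true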

answered Sep 30 '22 by Johnny C


I don't know how rack-cache performance compares to Varnish in terms of raw requests. The best bet would be to create a simple app, benchmark it on each stack, and compare.

It's worth bearing in mind that because the heroku.com (Bamboo) stack was Nginx -> Varnish -> App, your app layer will be doing much less work as long as you've set the correct HTTP headers. Most of the delivery is handled by Varnish, which is pretty fast, and that frees up your dyno for actually handling application requests.
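
For example, a controller can declare a response publicly cacheable so that Varnish (on Bamboo) or rack-cache (on Cedar) can answer repeat requests without touching the app again; a sketch with a made-up controller and TTL:

    # app/controllers/articles_controller.rb -- illustrative only
    class ArticlesController < ApplicationController
      def show
        @article = Article.find(params[:id])
        # Sets "Cache-Control: max-age=300, public" on the response.
        expires_in 5.minutes, :public => true
      end
    end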

With the herokuapp.com (Cedar) stack, requests hit your app earlier, so it's down to you and your app to handle caching efficiently. That might mean using rack-cache to cache full-page output, using memcached to deal with database requests, or both.

In the end it depends on what your app is doing. If it's serving the same static content to a lot of users, you'd benefit from Varnish; but if you've got an application where users log in and interact with content, you won't see much benefit from Varnish, so caching partial content or raw database queries might be more efficient. If you install the New Relic add-on you'll be able to take a peek under the hood and see what's slowing your app down.
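
Caching a raw database query in memcached might look something like this (a sketch; the method, key and TTL are made up, and it assumes Rails.cache is backed by the memcache add-on):

    # Illustrative sketch of query caching via Rails.cache
    def popular_articles
      Rails.cache.fetch("articles/popular", :expires_in => 10.minutes) do
        Article.order("views DESC").limit(10).to_a
      end
    end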

answered Sep 30 '22 by Tom


There are also third-party options for hosted Varnish. I wrote a quick post about setting it up with Fastly/Varnish.

Fastly is a hosted Varnish solution that will sit in front of your Heroku app and serve cached responses.

Update link: https://medium.com/@harlow/scale-rails-with-varnish-http-caching-layer-6c963ad144c6

I've seen really great response times. If you can get a good cache hit-rate with Varnish, you should be able to throttle back a good percentage of your dynos.
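
If you go this route, the edge cache is still driven by the headers your app returns; a sketch with a made-up controller and TTLs (verify the exact header behaviour against Fastly's docs):

    # Illustrative only: the CDN caches based on the response headers.
    class ProductsController < ApplicationController
      def show
        @product = Product.find(params[:id])
        # Cacheable by the edge and by browsers for one minute.
        expires_in 1.minute, :public => true
        # Fastly also reads Surrogate-Control for edge-only TTLs
        # (assumption -- check their documentation for specifics).
        response.headers["Surrogate-Control"] = "max-age=86400"
      end
    end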

answered Sep 30 '22 by harlow