Let's say you have a fragment of the page that displays the most recent posts, and you expire it after 30 minutes. I'm using Rails here.
<% cache("recent_posts", :expires_in => 30.minutes) do %>
...
<% end %>
Obviously you don't need to do the database lookup to get the most recent posts if the fragment exists, so you should be able to avoid that overhead too.
What I'm doing now is something like this in the controller which seems to work:
unless Rails.cache.exist? "views/recent_posts"
  @posts = Post.find(:all, :limit => 20, :order => "updated_at DESC")
end
Is this the best way? Is it safe?
One thing I don't understand is why the key is "recent_posts" for the fragment but "views/recent_posts" when checking later; I came up with this after watching memcached -vv to see what it was using. Also, I don't like the duplication of manually entering "recent_posts"; it would be better to keep that in one place.
Ideas?
Page caches are always stored on disk. Rails 2.1 and above provide ActiveSupport::Cache::Store which can be used to cache strings. Some cache store implementations, like MemoryStore, are able to cache arbitrary Ruby objects, but don't count on every cache store to be able to do that.
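For instance, a quick sketch of that API (the key and value here are made up):
# write and read a plain string; every cache store can handle strings
Rails.cache.write("greeting", "hello", :expires_in => 5.minutes)
Rails.cache.read("greeting")              # => "hello"

# fetch returns the cached value, or runs the block and stores the result on a miss
Rails.cache.fetch("greeting") { "hello" }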
Cache sweeping is a mechanism which allows you to get around having a ton of expire_{page,action,fragment} calls in your code. It does this by moving all the work required to expire cached content into an ActionController::Caching::Sweeper class.
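A minimal sweeper sketch, assuming the Post model and the "recent_posts" fragment from the question:
class PostSweeper < ActionController::Caching::Sweeper
  observe Post   # watch for changes to Post records

  # expire the cached fragment whenever a post is created or updated
  def after_save(post)
    expire_fragment("recent_posts")
  end

  def after_destroy(post)
    expire_fragment("recent_posts")
  end
end
You would then hook it up in the relevant controller with cache_sweeper :post_sweeper.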
Memcached caches objects in RAM to speed up access to content: data is fetched directly from memory instead of from an external data source. One of the most popular memcached servers is the open-source one from Danga Interactive.
By default Rails provides fragment caching. In order to use page and action caching you will need to add actionpack-page_caching and actionpack-action_caching to your Gemfile. By default, caching is only enabled in your production environment.
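For example (the memcached address below is just a placeholder):
# Gemfile
gem "actionpack-page_caching"
gem "actionpack-action_caching"

# config/environments/production.rb
config.action_controller.perform_caching = true
config.cache_store = :mem_cache_store, "localhost:11211"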
Evan Weaver's Interlock Plugin solves this problem.
You can also implement something like this yourself easily if you need different behavior, such as more fine-grained control. The basic idea is to wrap your controller code in a block that is only executed if the view actually needs that data:
# in FooController#show
@foo_finder = lambda { Foo.find_slow_stuff }

# in foo/show.html.erb
<% cache 'foo_slow_stuff' do %>
  <% @foo_finder.call.each do |foo| %>
    ...
  <% end %>
<% end %>
If you're familiar with the basics of Ruby metaprogramming, it's easy enough to wrap this up in a cleaner API to your taste.
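As one possible sketch (the LazyAssign module and lazy_set helper are made up for illustration):
# Hypothetical helper: stores a lambda so the query only runs
# if the view calls it, i.e. on a cache miss.
module LazyAssign
  def lazy_set(name, &block)
    instance_variable_set("@#{name}_finder", block)
  end
end

class FooController < ApplicationController
  include LazyAssign

  def show
    lazy_set(:foo) { Foo.find_slow_stuff }   # defines @foo_finder
  end
end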
This is superior to putting the finder code directly in the view: the query logic stays in the controller where developers expect to find it, and it still only runs when the cached fragment is missing.
I think cache_fu might have similar functionality in one of its versions/forks, but I can't recall specifically.
The advantage you get from memcached is directly related to your cache hit rate. Take care not to waste your cache capacity and cause unnecessary misses by caching the same content multiple times; for example, don't cache a set of record objects as well as their HTML fragment at the same time. Generally fragment caching will offer the best performance, but it really depends on the specifics of your application.
What happens if the cache expires between the time you check for it in the controller and the time it is checked during view rendering?
I'd make a new method in the model:
class Post
  def self.recent(count)
    find(:all, :limit => count, :order => "updated_at DESC")
  end
end
then use that in the view:
<% cache("recent_posts", :expires_in => 30.minutes) do %>
<% Post.recent(20).each do |post| %>
...
<% end %>
<% end %>
For clarity, you could also consider moving the rendering of a recent post into its own partial:
<% cache("recent_posts", :expires_in => 30.minutes) do %>
<%= render :partial => "recent_post", :collection => Post.recent(20) %>
<% end %>
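The partial itself might look something like this (the path and markup are placeholders; with :collection the local variable is named after the partial, and this assumes Post has a title attribute):
<%# app/views/posts/_recent_post.html.erb %>
<li><%= h recent_post.title %></li>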