I have an expensive (time-consuming) external request to another web service I need to make, and I'd like to cache it. So I attempted to use this idiom, by putting the following in the application controller:
def get_listings
  cache(:get_listings!)
end

def get_listings!
  return Hpricot.XML(open(xml_feed))
end
When I call get_listings! in my controller everything is cool, but when I call get_listings, Rails complains that no block was given. When I look up that method I see that it does indeed expect a block, and it also looks like it's only meant for use in views. So I'm guessing that, although it wasn't stated, the example was just pseudocode.
So my question is, how do I cache something like this? I tried various other ways but couldn't figure it out. Thanks!
In order to use page and action caching you will need to add actionpack-page_caching and actionpack-action_caching to your Gemfile. By default, caching is only enabled in your production environment. You can play around with caching locally by running rails dev:cache, or by setting config.action_controller.perform_caching to true in your development environment.
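For reference, a minimal sketch of that setup (file locations are the conventional ones):

# Gemfile
gem 'actionpack-page_caching'
gem 'actionpack-action_caching'

# config/environments/development.rb -- enable caching locally
config.action_controller.perform_caching = true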
Rails provides an SQL query cache which is used to cache the results of database queries for the duration of a request. When Rails encounters the same query again within the same request, it uses the cached result set instead of running the query against the database again.
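As a sketch (Listing is just an illustrative model name), within a single request:

Listing.find(1)   # runs the SELECT and caches the result set for this request
Listing.find(1)   # identical SQL, answered from the query cache instead of the database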
Rails (as of 2.1) provides different stores for the cached data created by action and fragment caches. Page caches are always stored on disk. Rails 2.1 and above provide ActiveSupport::Cache::Store which can be used to cache strings.
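The store Rails.cache uses is configured per environment; for example (the store choices below are only illustrative):

# config/environments/production.rb
config.cache_store = :mem_cache_store, 'localhost'
# or, for simpler setups:
# config.cache_store = :file_store, "#{Rails.root}/tmp/cache"
# config.cache_store = :memory_store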
Rails.cache.clear will clear your app cache. In that case rake tmp:cache:clear will just try to remove files from "#{Rails.root}/tmp/cache", but it probably won't actually do anything, since nothing is likely being cached on the filesystem.
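Concretely (a sketch; whether either command does anything depends on the store you've configured):

Rails.cache.clear      # empties whatever store Rails.cache is backed by
rake tmp:cache:clear   # only deletes files under "#{Rails.root}/tmp/cache"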
An in-code approach could look something like this:
def get_listings
  @listings ||= get_listings!
end

def get_listings!
  Hpricot.XML(open(xml_feed))
end
which will cache the result on a per-request basis (each request gets a new controller instance), though you may also like to look at the 'memoize' helpers as an API option.
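For example, on the Rails 2.x versions this question targets, the memoize helper looked roughly like this (the controller name is made up, and ActiveSupport::Memoizable was later deprecated and removed, so treat this as a sketch for that era):

class ListingsController < ApplicationController
  extend ActiveSupport::Memoizable

  def get_listings
    Hpricot.XML(open(xml_feed))
  end
  memoize :get_listings
end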
If you want to share data across requests, don't store it on class objects: your app won't be thread-safe unless you're comfortable with concurrent programming and can make sure the threads don't interfere with each other's access to the shared variable.
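To make that concrete, this is the kind of class-level caching to avoid (sketch only):

def get_listings
  # @@listings is shared by every thread in the process, so concurrent
  # requests can read and write it at the same time.
  @@listings ||= Hpricot.XML(open(xml_feed))
end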
The "rails way" to cache across requests is the Rails.cache store. Memcached gets used a lot, but you might find the file or memory stores fit your needs. It really depends on how you're deploying and whether you want to prioritise cache hits, response time, storage (RAM), or use a hosted solution e.g. a heroku addon.