The DB load on my site is getting really high, so it's time for me to cache common queries that are being called thousands of times an hour where the results are not changing. So, for instance, on my City model I do the following:
    def self.fetch(id)
      Rails.cache.fetch("city_#{id}") { City.find(id) }
    end

    def after_save
      Rails.cache.delete("city_#{self.id}")
    end

    def after_destroy
      Rails.cache.delete("city_#{self.id}")
    end
So now when I call City.find(1), the first time I hit the DB, but the next 1000 times I get the result from memory. Great. But most of the calls to city are not City.find(1) but @user.city.name, where Rails does not use the fetch but queries the DB again... which makes sense, but it's not exactly what I want it to do.
I can do City.find(@user.city_id) but that is ugly.
So my question to you guys: what are the smart people doing? What is the right way to do this?
Rails provides an SQL query cache which is used to cache the results of database queries for the duration of a request. When Rails encounters the same query again within the same request, it uses the cached result set instead of running the query against the database again.
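As a rough illustration (the Product model here is just a placeholder), running the exact same query twice inside one request only hits the database once:

    class ProductsController < ApplicationController
      def index
        # The first call runs the SELECT against the database
        @products = Product.all.to_a
        # The identical SELECT later in the same request is answered
        # from the SQL query cache instead of the database
        @products_again = Product.all.to_a
      end
    end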
In order to use page and action caching you will need to add actionpack-page_caching and actionpack-action_caching to your Gemfile. By default, caching is only enabled in your production environment. You can play around with caching locally by running rails dev:cache, or by setting config.action_controller.perform_caching to true in a local environment.
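For reference, that setup might look roughly like this (gem versions are left unpinned as an assumption):

    # Gemfile
    gem 'actionpack-page_caching'
    gem 'actionpack-action_caching'

    # config/environments/development.rb
    # Enable caching locally; running `rails dev:cache` toggles this for you
    config.action_controller.perform_caching = true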
ActiveRecord makes accessing your database easy, but it can also help make it faster by its intelligent use of caching.
To use Redis as a Rails cache store, use a dedicated Redis instance that's set up as an LRU (Least Recently Used) cache instead of pointing the store at your existing Redis server, to make sure entries are dropped from the store when it reaches its maximum size.
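A minimal sketch of that setup using the built-in redis_cache_store (Rails 5.2+); the URL and database number are placeholders:

    # config/environments/production.rb
    # Point the Rails cache store at a dedicated Redis instance.
    # That Redis should be configured with maxmemory and
    # maxmemory-policy allkeys-lru so old entries get evicted.
    config.cache_store = :redis_cache_store, { url: "redis://localhost:6379/1" }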
With respect to the caching, a couple of minor points:
It's worth using a slash to separate object type and id, which is the Rails convention. Even better, ActiveRecord models provide the cache_key instance method, which gives a unique identifier made of the table name and id, "cities/13" etc.
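Applied to the fetch method from the question, that convention would look something like this (a sketch, keeping the rest of the model as-is):

    def self.fetch(id)
      Rails.cache.fetch("cities/#{id}") { City.find(id) }
    end

    City.find(13).cache_key  # => "cities/13" (newer Rails versions may append a timestamp)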
One minor correction to your after_save filter: since you have the data on hand, you might as well write it back to the cache rather than delete it. That saves you a trip to the database ;)
    def after_save
      Rails.cache.write(cache_key, self)
    end
As to the root of the question, if you're continuously pulling @user.city.name, there are two real choices:

Cache the City lookup by id (a fetch-style method like the one above) so that @user.city is served from memcached rather than the database

-or-

Denormalize the city name into the users table, so the user row can answer the question without touching City at all
Personal opinion: Implement basic id based fetch methods (or use a plugin) to integrate with memcached, and denormalize the city name to the user's row.
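A rough sketch of that denormalization; the city_name column and the callback are illustrative names, not something Rails gives you out of the box:

    # Migration: add a denormalized city_name column to users
    class AddCityNameToUsers < ActiveRecord::Migration
      def change
        add_column :users, :city_name, :string
      end
    end

    # app/models/user.rb
    class User < ActiveRecord::Base
      belongs_to :city
      before_save :copy_city_name

      private

      # Keep the denormalized column in sync whenever the city changes
      def copy_city_name
        self.city_name = city.name if city_id_changed? && city
      end
    end

With that in place, views can read @user.city_name and never load the City row at all.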
I'm personally not a huge fan of cached model style plugins, I've never seen one that's saved a significant amount of development time that I haven't grown out of in a hurry.
If you're getting way too many database queries it's definitely worth checking out eager loading (through :include) if you haven't already. That should be the first step for reducing the quantity of database queries.
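For instance (using the modern includes spelling of :include), loading users together with their cities avoids the N+1 query pattern:

    # Two queries total: one for the users, one for all of their cities
    users = User.includes(:city).limit(50)
    users.each { |user| puts user.city.name }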