I recently switched from the default Simple I18n backend to a Redis backend for my I18n. I did it to make it easier for us to manage translations, but I've found that there's a substantial performance hit on every page.
I've run some benchmarks with Rails 3.2 and Redis 2.6.4 installed on my MBP to demonstrate. I'm using hiredis-rb as my client.
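For reference, the two backends were set up roughly like this (a sketch of my configuration, not copied verbatim from the app; the KeyValue backend wants a store that responds to [], []= and keys, which the redis-rb client I'm on provides):

require 'i18n'
require 'i18n/backend/key_value'
require 'redis'

# Default Simple backend: translations loaded from YAML into memory on first lookup
I18n.backend = I18n::Backend::Simple.new

# Redis-backed KeyValue backend: every lookup is a network round trip
I18n.backend = I18n::Backend::KeyValue.new(Redis.new(driver: :hiredis))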
The difference is pretty clear when exercising the two backends. With the Simple backend there is a short delay on the first call - I assume the translations are being loaded into memory - and great performance after that:
pry(main)> Benchmark.realtime { 500.times { I18n.t 'shared.slogan' } }
=> 0.143246
pry(main)> Benchmark.realtime { 500.times { I18n.t 'shared.slogan' } }
=> 0.00415
pry(main)> Benchmark.realtime { 500.times { I18n.t 'shared.slogan' } }
=> 0.004153
pry(main)> Benchmark.realtime { 500.times { I18n.t 'shared.slogan' } }
=> 0.004056
The Redis backend is consistently slow:
pry(main)> Benchmark.realtime { 500.times { I18n.t 'shared.slogan' } }
=> 0.122448
pry(main)> Benchmark.realtime { 500.times { I18n.t 'shared.slogan' } }
=> 0.263564
pry(main)> Benchmark.realtime { 500.times { I18n.t 'shared.slogan' } }
=> 0.232637
pry(main)> Benchmark.realtime { 500.times { I18n.t 'shared.slogan' } }
=> 0.122304
It makes perfect sense to me why this is slow: I'm making dozens of separate I18n calls throughout my code base, and each one is its own round trip to Redis. If I could batch them together up front I'd be in good shape:
pry(main)> keys = $redis.keys[0..500]
pry(main)> Benchmark.realtime { $redis.mget keys }
=> 0.04264
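Something along these lines is what I mean by batching up front - one round trip per locale instead of one per key (just a sketch; I'm guessing at the KeyValue backend's "locale.flattened.key" naming, and preload_translations is a hypothetical helper of my own):

# Fetch every key for a locale in a single MGET and keep the result in memory
def preload_translations(redis, locale)
  keys = redis.keys("#{locale}.*")
  return {} if keys.empty?
  Hash[keys.zip(redis.mget(*keys))]
end

translations = preload_translations($redis, :en)
translations['en.shared.slogan']  # served from memory, no round trip per I18n.t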
But I don't really see a clean way to do this with any of the existing I18n backends. Has anybody out there tackled this problem?
I took Chris Heald's suggestion and created a backend with memoization and a simple cache bust. The gist is up here:
https://gist.github.com/wheeyls/5650947
I'll try this out for a few days and then turn it into a gem.
My solution is available as a gem now:
https://github.com/wheeyls/cached_key_value_store
And I also blogged about this problem:
http://about.g2crowd.com/faster-i18nredis-on-rails/
Network traffic will always be slower than local work. You might consider an in-memory cache, and then just pull the current localization version on each request (or even just on a short timer) to determine whether to invalidate the cache. It looks like there's a Memoization module (per the source here) that you can just mix into an I18n backend. Then we just tweak the #lookup method so that every 5 minutes it checks Redis for an updated locale version, and make sure the locale version is incremented whenever new translations are saved.
This gives you an in-memory cache of all your translations, so lookups are very fast, while still letting you change translations on the fly - a change may take up to 5 minutes to show up, but you never have to do any explicit cache purging.
If you wanted, you could do the check on every request with a before_filter rather than relying on the lazy 5-minute expiration. That means more requests to Redis, but you'd never see stale translations.
module I18n
  module Backend
    class CachedKeyValueStore < KeyValue
      include Memoize

      # Bump the locale's version and drop the memoized translations whenever
      # translations are written, so other processes pick up the change.
      def store_translations(locale, data, options = {})
        @store.incr "locale_version:#{locale}"
        reset_memoizations!(locale)
        super
      end

      def lookup(locale, key, scope = nil, options = {})
        ensure_freshness(locale)
        flat_key = I18n::Backend::Flatten.normalize_flat_keys(locale,
          key, scope, options[:separator]).to_sym
        flat_hash = memoized_lookup[locale.to_sym]
        flat_hash.key?(flat_key) ? flat_hash[flat_key] : (flat_hash[flat_key] = super)
      end

      # At most once every 5 minutes, compare the locale version stored in
      # Redis with the one we last saw; if it changed, clear the memoized
      # translations for that locale.
      def ensure_freshness(locale)
        @last_check ||= Time.at(0)

        if @last_check < 5.minutes.ago
          @last_check = Time.now
          current_version = @store.get "locale_version:#{locale}"
          if @last_version != current_version
            reset_memoizations! locale
            @last_version = current_version
          end
        end
      end
    end
  end
end
I just hacked this up from reading the I18n source, and I haven't tested it at all, so it might need some work, but I think it communicates the idea well enough.
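For completeness, wiring it in (and the per-request check I mentioned) might look something like this - again untested, and it assumes $redis is your redis-rb connection and the class above is loaded from an initializer:

# config/initializers/i18n.rb
I18n.backend = I18n::Backend::CachedKeyValueStore.new($redis)

# If you drop the 5-minute guard from #ensure_freshness, a before_filter
# can run the version check on every request instead:
class ApplicationController < ActionController::Base
  before_filter :check_i18n_version

  def check_i18n_version
    I18n.backend.ensure_freshness(I18n.locale)
  end
end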