In a fairly big legacy project, I've refactored several hairy modules into Moose classes. Each of these modules requires database access to lazily fetch its attributes. Since those objects are used pretty heavily, I want to reduce the number of redundant requests, for example for unchanged data.
Now, how do I do that properly? I've got several alternatives:

1. memcached with expiration of 5-10 minutes (probably not too difficult, but tricky with lazy attributes; see the sketch after this list). Update: KiokuDB could probably help here, have to read up about attributes.
2. DBIx::Class (needs to be done anyway) and implement caching on this level (DBIC will probably take most of the pain away just by itself).

How would you do this, and what do you consider a sane way? Is caching data preferred at the object level or at the ORM level?
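For the memcached alternative, here is a minimal sketch of wiring a lazy Moose attribute to a shared cache. The class name, key scheme, My::Schema DBIx::Class schema, DSN, and the 10-minute expiry are all assumptions for illustration, not taken from the project:

package My::Customer;
use Moose;
use Cache::Memcached;

# Shared across all instances in this process; the server address is an assumption.
my $memd = Cache::Memcached->new({ servers => ['127.0.0.1:11211'] });

has id => ( is => 'ro', isa => 'Int', required => 1 );

# Lazily fetched attribute: check memcached first, fall back to the
# database, then cache the result with a 10-minute expiry.
has name => (
    is      => 'ro',
    isa     => 'Str',
    lazy    => 1,
    builder => '_build_name',
);

sub _build_name {
    my $self = shift;
    my $key  = 'customer:name:' . $self->id;

    my $cached = $memd->get($key);
    return $cached if defined $cached;

    # Hypothetical DBIx::Class lookup standing in for the real query.
    my $row = My::Schema->connect('dbi:SQLite:app.db')
                        ->resultset('Customer')
                        ->find( $self->id );
    my $name = $row->name;

    $memd->set( $key, $name, 600 );    # expire after 10 minutes
    return $name;
}

__PACKAGE__->meta->make_immutable;
1;

The same builder-level check could just as well sit behind DBIx::Class resultset caching; the point is only that the cache lookup lives inside the lazy builder, so unchanged data never hits the database.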
The short answer to #3 is: Don't use 'my'. You might do something like:
use vars qw($object);
# or, on Perl 5.6 and later:
# our $object;

# Create the object if it doesn't already exist in this process.
$object ||= create_object();

# Maybe reload some attributes if they have expired.
$object->check_expires;
Objects created like this inside your handler will be shared only within each Apache child process, which is fine if you are reloading the data every 5-10 minutes anyway. Any modules and objects that are read-only should be loaded in a PerlPostConfigRequire script so that they are shared across all children.
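The answer leaves check_expires undefined. A minimal sketch of one way it could work, assuming the object records when it was last loaded and its lazy attributes declare clearers so the next access rebuilds them; the class, attribute, and method names below are illustrative, not from the original code:

package My::CachedThing;
use Moose;

# When the data was last (re)loaded from the database.
has _loaded_at => (
    is      => 'rw',
    isa     => 'Int',
    default => sub { time },
);

# A lazy attribute with a clearer: clearing it forces the builder
# to run again on the next access.
has name => (
    is      => 'ro',
    isa     => 'Str',
    lazy    => 1,
    builder => '_build_name',
    clearer => 'clear_name',
);

sub _build_name {
    my $self = shift;
    # ... fetch the value from the database here ...
    return 'placeholder';
}

sub check_expires {
    my $self = shift;

    # Drop cached attributes if they are older than 10 minutes.
    if ( time - $self->_loaded_at > 600 ) {
        $self->clear_name;
        $self->_loaded_at(time);
    }
    return $self;
}

__PACKAGE__->meta->make_immutable;
1;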