PHP Caching, Storing cache in multiple directories

Tags:

php

We have a number of client sites still at GoDaddy, and they're complicated sites, so it will take some time for us to get around to migrating them. Ultimately, they'll end up in a dedicated environment, but for now, we're stuck with GoDaddy's shared hosting scenario.

I want to set up output-buffer PHP caching to static files, and have the cache clear maybe every 3-4 hours. What I've seen online for accomplishing this seems to throw all the cache files into a single big directory.
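Something along the lines of this rough sketch is what I have in mind (the cache path, file naming, and the 4-hour window are just placeholders), but note how every cached page ends up in the one cache/ directory:

    <?php
    // Rough sketch: serve a static cache file if it's fresh, otherwise buffer
    // the page output and save it. The cache/ directory is assumed to exist.
    $cacheFile = __DIR__ . '/cache/' . md5($_SERVER['REQUEST_URI']) . '.html';
    $maxAge    = 4 * 60 * 60; // roughly the 3-4 hour window

    if (is_file($cacheFile) && (time() - filemtime($cacheFile)) < $maxAge) {
        readfile($cacheFile); // serve the cached copy and stop
        exit;
    }

    ob_start(); // start capturing output

    // ... normal page generation happens here ...

    $html = ob_get_contents();            // grab the buffered page
    file_put_contents($cacheFile, $html); // write the static copy
    ob_end_flush();                       // send the page to the browser as usual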

GoDaddy has a policy (something we're battling now on another issue) where they don't allow more than 1,024 files in a single directory, because it degrades performance for the shared environment. I don't dispute that fact, but I'm hoping there may be a clever solution for PHP caching where we have a nested caching structure of sorts.

Any ideas?

Will Ashworth asked Dec 20 '25 18:12

1 Answer

What you can do is split the cache into a nested sub-directory structure. A lot of it is going to depend on how you are storing/naming the files you're caching, of course, so this is just a general approach.

For example, if you are caching page 20150.php, store the file as /cache/20/20150.php, limiting each second-level directory to 1,000 files. If you know you need to load 20150.php, then you know to use the thousands digits as the directory name.

This is how very large quantities of images and documents are often stored in file systems by document management systems. This is ideal if the content you are caching has associated article IDs in a database.
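A rough sketch of that ID-based layout in PHP (the /cache root, the helper name, and the 1,000-files-per-directory split are assumptions to adjust):

    <?php
    // Map a numeric article/page ID to a nested cache path, e.g.
    // ID 20150 -> /cache/20/20150.php  (at most 1,000 files per sub-directory)
    function cachePathForId($id, $cacheRoot = '/cache')
    {
        $bucket = (int) floor($id / 1000); // thousands digits become the sub-directory
        $dir    = $cacheRoot . '/' . $bucket;

        if (!is_dir($dir)) {
            mkdir($dir, 0755, true);       // create the bucket directory on demand
        }

        return $dir . '/' . $id . '.php';
    }

    // Both writing and reading the cache use the same deterministic path:
    $path = cachePathForId(20150); // "/cache/20/20150.php"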

Another way is to create an actual hash of the page (md5, or the algorithm of your choice) and base the structure on that, with the top directory being the first 3 characters of the hash and the one below it the next 3. For example, the file c88ad6a7421ef2b0d7451c3390a00a39.php might be stored as /cache/c88/ad6/c88ad6a7421ef2b0d7451c3390a00a39.php.
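A sketch of that hash-based variant (hashing the request URI here is just an assumption; hash whatever uniquely identifies the page for you):

    <?php
    // Map a page identifier to a hash-based nested cache path, e.g.
    // c88ad6a7... -> /cache/c88/ad6/c88ad6a7421ef2b0d7451c3390a00a39.php
    function cachePathForKey($key, $cacheRoot = '/cache')
    {
        $hash = md5($key);
        $dir  = $cacheRoot . '/' . substr($hash, 0, 3) . '/' . substr($hash, 3, 3);

        if (!is_dir($dir)) {
            mkdir($dir, 0755, true); // build e.g. /cache/c88/ad6/ on demand
        }

        return $dir . '/' . $hash . '.php';
    }

    // Example:
    $path = cachePathForKey($_SERVER['REQUEST_URI']);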

The other thing to do, of course, is to make sure your db is as optimized as it can get, so that you know you actually need to cache content. Caching is fine, but it should be done once you've squeezed everything you can out of the db (indexes, optimized queries).

There are also caching servers such as memcached, but since you're on shared hosting I'm assuming that's not available to you.

