I have a 'theoretical' question, to see whether a solution I'm planning makes sense or not:
I have a script that reads a lot of data out of the database (settings, configuration, etc.) and assembles it for every registered user. I won't go into too much detail about why or what exactly.
My idea was that I could do that work only once and cache the result in a .inc file named after the user's ID. If the user changes something, the file is of course recreated.
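To make this concrete, here is a minimal sketch of the pattern I have in mind (getUserConfig(), buildUserConfig() and the paths are placeholders, not my actual code):

    <?php
    function getUserConfig(PDO $db, int $userId): array
    {
        // One cached .inc file per user ID (path is just an example).
        $cacheFile = __DIR__ . "/cache/user_{$userId}.inc";

        // Fast path: the cache file exists, so just include it.
        if (is_file($cacheFile)) {
            return include $cacheFile;
        }

        // Slow path: run the expensive database queries ...
        $config = buildUserConfig($db, $userId); // placeholder for the build logic

        // ... and write the result out as a PHP file that returns the array.
        file_put_contents(
            $cacheFile,
            '<?php return ' . var_export($config, true) . ';',
            LOCK_EX
        );

        return $config;
    }

    // Whenever the user changes a setting, I would unlink() the cache file
    // so it gets rebuilt on the next request.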
But now, suppose I do that with 1,000,000 files, or even more. Will I run into problems when including those files (always one specific file per request, never all of them at once)? Is that generally a good idea, or am I just putting even more stress on the server?
I'm also planning to put everything into the same cache folder. Would I get better performance if I split it up into multiple folders?
Thanks for the help.
You will be limited by the file system: most file systems get very slow (or hit hard limits) once a single directory holds that many files. The usual trick is to hash the file name and spread the files across nested subdirectories, like this:
file1.php
hashes to 3305d5836bea089f2a1758d8e89848c8
and is stored as 3/3/0/5/d/5836bea089f2a1758d8e89848c8 (the first few characters of the hash become nested directories)
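A rough sketch of that in PHP (assuming md5 as the hash and one character per directory level, as in the example above):

    <?php
    // Turn a cache key into a fanned-out path, e.g.
    // "file1.php" -> <base>/3/3/0/5/d/5836bea089f2a1758d8e89848c8
    function cachePath(string $key, string $baseDir, int $levels = 5): string
    {
        $hash = md5($key);

        // The first $levels hex characters become nested directories.
        $dir = $baseDir . '/' . implode('/', str_split(substr($hash, 0, $levels)));

        // Create the directory tree on first use.
        if (!is_dir($dir)) {
            mkdir($dir, 0777, true);
        }

        // The rest of the hash is the file name inside that directory.
        return $dir . '/' . substr($hash, $levels);
    }

Five one-character levels give up to 16^5 = ~1,000,000 leaf directories; a milder fan-out such as two levels of two hex characters each also works and caps every directory at 256 entries.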