1 million or more files in one folder, for include (cache) [closed]

I have a 'theoretical' question, to see if a solution I'm planning makes sense or not:

I have a script that reads a lot of data from the database - settings, configuration, etc. - and builds it all together (for every registered user). I won't go into too much detail about why or what exactly.

My idea was that I could do that work only once and cache the result in a .inc file named after the user's ID. If the user changes something, the file is recreated, of course.
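
Roughly the pattern I have in mind (a minimal sketch; buildUserConfig() is just a placeholder for my actual DB logic):

```php
<?php
// Minimal sketch of the idea. buildUserConfig() stands in for the
// real DB queries that assemble the user's settings/configuration.
function getUserConfig(int $userId): array
{
    $cacheFile = __DIR__ . "/cache/user_{$userId}.inc";

    // Rebuild the cache file if it doesn't exist yet (it gets
    // deleted whenever the user changes something).
    if (!is_file($cacheFile)) {
        $config = buildUserConfig($userId); // the expensive DB work
        file_put_contents(
            $cacheFile,
            '<?php return ' . var_export($config, true) . ';'
        );
    }

    // Including the file just returns the cached array.
    return include $cacheFile;
}
```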

But now, let's suppose I do that with 1'000'000 files - or even more. Will I run into issues when including those files? (Always one specific file per request, never every file at once.) Is this generally a good idea, or am I just stressing the server even more?

And I'm planning to put everything in the same cache folder - would I see performance improvements if I split it up into multiple folders?

Thanks for the help.

Katai asked Sep 12 '25

1 Answer

You will be limited by the file system. Putting that many files in a single folder is a bad idea: directory lookups slow down badly at that scale, and some file systems impose hard limits on entries per directory. You can work around it like this:

  1. Hash the file name: file1.php becomes 3305d5836bea089f2a1758d8e89848c8
  2. Split the hash into several parts: 3/3/0/5/d/5836bea089f2a1758d8e89848c8
  3. Store the file at that path under your cache folder - each directory then holds only a tiny fraction of the files (see the sketch below).
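
A quick sketch of that scheme in PHP (assuming md5() as the hash and five single-character levels, as in the example above):

```php
<?php
// Turn a cache key into a sharded path: hash it, then use the
// first few characters of the hash as nested directory names.
function shardedCachePath(string $name, string $baseDir = 'cache'): string
{
    $hash  = md5($name);                      // 32 hex characters
    $parts = str_split(substr($hash, 0, 5));  // first 5 chars -> 5 levels
    $dir   = $baseDir . '/' . implode('/', $parts);

    if (!is_dir($dir)) {
        mkdir($dir, 0777, true);              // create nested dirs recursively
    }

    return $dir . '/' . substr($hash, 5) . '.inc';
}

// Example: one path out of 16^5 possible buckets,
// so each directory stays tiny even with millions of files.
echo shardedCachePath('user_42');
```

In practice even two or three levels is plenty: with three single-character levels you get 16³ = 4096 buckets, roughly 250 files each for a million entries.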
Florent answered Sep 14 '25