Every day, user-uploaded files older than a month are deleted from the server. Uploads are stored in per-day directories (e.g. /var/www/media/2013-03-13), so it's easy to identify the files/directories that need to be deleted.
Problem: deleting 100,000 files at a time makes the server unresponsive and takes a long time (Ubuntu 12.04, 2x 2 TB ext4 SATA3 HDDs in software RAID1). At the moment PHP runs exec on the command:

find /path/to/dir -maxdepth 1 -name '*' -delete
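(As an aside, and an assumption beyond what the question states: if the real problem is the deletion starving foreground I/O rather than the deletion taking long, the same find can be run at idle I/O and CPU priority with the standard ionice and nice utilities. This should keep the server responsive at the cost of a slower delete.)

ionice -c 3 nice -n 19 find /path/to/dir -maxdepth 1 -name '*' -delete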
How do I split up the files required for deletion? Doing an ls on those large directories takes a very long time. The solution does not need to be in PHP.

It does not even require splitting the files into smaller batches: since the uploads are already grouped into per-day directories, you can rm -rf those directories once they are older than a month. I'm not sure if this is faster than your method, but it avoids explicitly listing all the files in the directories.
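As a sketch of that approach (the media path and the 30-day cutoff are taken from the question, and matching on -mtime assumes a directory's modification time reflects its last upload), one find invocation can select the stale date-named directories and hand each one to rm, which walks the tree internally instead of enumerating 100,000 files up front:

find /var/www/media -maxdepth 1 -type d -name '????-??-??' -mtime +30 -exec rm -rf {} +

The -name pattern keeps /var/www/media itself out of the match; since the directory names encode the date, you could also compute the cutoff from the name instead of relying on mtime.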