I'm writing a bash script that needs to delete old files.
It's currently implemented using:
find $LOCATION -name $REQUIRED_FILES -type f -mtime +1 -delete
This will delete all of the files older than 1 day.
However, what if I need a finer resolution than 1 day, say 6 hours? Is there a nice, clean way to do it, like there is with find and -mtime?
-mmin +<minutes> : The -mmin option matches files/directories by how many minutes ago they were last modified (replace <minutes> with an integer). In our case, we want to find files older than 10 minutes, so we use -mmin +10; if you needed files older than 20 minutes, you would simply use -mmin +20.
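As a quick illustration (the directory path here is just a hypothetical example), this lists regular files under /var/log/myapp last modified more than 10 minutes ago; add -delete only once you are happy with the output:
find /var/log/myapp -type f -mmin +10 -print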
Does your find have the -mmin option? That can let you test the number of minutes since last modification:
find $LOCATION -name $REQUIRED_FILES -type f -mmin +360 -delete
Or maybe look at using tmpwatch to do the same job. phjr also recommended tmpreaper in the comments.
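If you go the tmpwatch/tmpreaper route, the invocation is roughly as follows. This is a sketch only; the exact flags and time syntax vary between distributions and versions, so check the man page on your system:
# tmpwatch (RHEL/Fedora style): remove files under the directory not modified in the last 6 hours
tmpwatch --mtime 6 /path/to/the/folder
# tmpreaper (Debian/Ubuntu): same idea, with the age given as 6h for six hours
tmpreaper --mtime 6h /path/to/the/folder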
Here is the approach that worked for me (I don't see it mentioned above):
$ find /path/to/the/folder -name '*.*' -mmin +59 -delete > /dev/null
This deletes all the files older than 59 minutes while leaving the folders intact. Note that -name '*.*' only matches names containing a dot, and the redirection to /dev/null is not strictly needed since -delete produces no output.
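To put this back in the context of the original script, a minimal sketch might look like the following. The variable values are placeholders; quoting them keeps the shell from expanding the glob and guards against paths with spaces, and -mmin +360 corresponds to the 6-hour cutoff asked about:
#!/bin/bash
# Sketch: delete matching files older than 6 hours (360 minutes).
# LOCATION and REQUIRED_FILES are placeholders for your own values.
LOCATION="/path/to/the/folder"
REQUIRED_FILES="*.log"
find "$LOCATION" -name "$REQUIRED_FILES" -type f -mmin +360 -delete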