 

cronjob to remove files older than 99 days

I have to write a cron job to remove files older than 99 days in a particular directory, but I'm not sure the file names were made by trustworthy Linux users. I must expect special characters, spaces, and other oddities in the names.

Here is what I think could work:

find /path/to/files -mtime +99 -exec rm {} \;

But I suspect this will fail if there are special characters, or if it finds a file that's read-only (cron may not be run with superuser privileges). I need it to keep going when it meets such files.

asked Dec 08 '22 by user1469220


2 Answers

When you use -exec rm {} \;, you shouldn't have any problems with spaces, tabs, returns, or special characters because find calls the rm command directly and passes it the name of each file one at a time.
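A quick way to convince yourself of this is a throwaway test in a scratch directory (the file names below are just examples of awkward cases):

```shell
# Create files with awkward names in a temporary directory and confirm
# that find passes each name to rm as a single, intact argument.
dir=$(mktemp -d)
touch "$dir/plain.txt" "$dir/with space.txt" "$dir/semi;colon.txt" "$dir/-dash.txt"

# The "--" guards rm against names that begin with a dash.
find "$dir" -type f -exec rm -- {} \;

ls -A "$dir"    # nothing left
rmdir "$dir"
```

No quoting tricks are needed because the shell never re-parses the names; find hands them to rm directly.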

Directories won't be removed with that command because you aren't passing it the -r parameter, and you probably don't want to; that could end up being a bit dangerous. You might also want to include the -f parameter to force removal in case you don't have write permission. Run the cron job as root, and you should be fine.

The only thing I'd worry about is that you might end up hitting a file that you don't want to remove, but that has not been modified in the past 100 days. For example, the password to stop the autodestruct sequence at your work. Chances are that file hasn't been modified in the past 100 days, but once that autodestruct sequence starts, you wouldn't want to be the one blamed because the password was lost.

Okay, a more reasonable example might be application files that are used but rarely modified, or someone's resume that hasn't been updated because they're holding a current job, etc.

So, be careful with your assumptions. Just because a file hasn't been modified in 100 days doesn't mean it isn't used. A better criterion (although still questionable) is whether the file has been accessed in the last 100 days. Maybe this as a final command:

 find /path/to/files -atime +99 -type f -exec rm -f {} \;

One more thing...

Some versions of find have a -delete action which can be used instead of -exec rm:

 find /path/to/files -atime +99 -delete

That will delete both found directories and files. Note that -delete implies -depth, so contents are processed before their directory; a matched directory is removed only once it's empty, and find reports an error on any non-empty one (add -type f if you want files only).
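A small scratch-tree check of that depth-first behavior (GNU find; the paths are throwaway examples):

```shell
# -delete implies -depth: the file inside sub/ is deleted first, which
# leaves sub/ empty so that -delete can then remove the directory itself.
dir=$(mktemp -d)
mkdir "$dir/sub"
touch "$dir/sub/file.txt"

find "$dir/sub" -delete

ls -A "$dir"    # sub is gone
rmdir "$dir"
```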

One more small recommendation: for the first week, log the files that find matches instead of removing them, and then examine the log file. This way, you make sure that you're not deleting something important. Once you're satisfied that there's nothing in the log you want to keep, revert the find command to do the delete for you.
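The dry run is just the same command with -print instead of a delete action. A self-contained sketch (scratch directory and a backdated access time stand in for the real tree; `touch -a -t` is GNU/POSIX touch):

```shell
# Build a scratch directory with one file whose atime is backdated far
# past the 99-day threshold, then log matches instead of deleting them.
dir=$(mktemp -d)
touch "$dir/old.txt"
touch -a -t 202001010000 "$dir/old.txt"   # set access time to Jan 2020

log=$(mktemp)
find "$dir" -atime +99 -type f -print >> "$log"

cat "$log"              # old.txt is listed...
ls "$dir"               # ...and still present, untouched
rm -rf "$dir" "$log"
```

In the real cron entry you'd point the redirect at a persistent log file and review it before switching -print back to -exec rm -f {} \; (or -delete).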

answered Dec 17 '22 by David W.


If you run rm with the -f option, the file is going to be deleted regardless of whether you have write permission on it or not (all that matters is write permission on the containing folder). So either you can erase all the files in the folder, or none. Also add -r if you want to erase subfolders.
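You can see this for yourself in a scratch directory (the file name is just an example):

```shell
# A read-only file in a writable directory: -f removes it without
# prompting, because deletion only needs write access to the directory.
dir=$(mktemp -d)
touch "$dir/readonly.txt"
chmod 444 "$dir/readonly.txt"   # strip write permission from the file

rm -f "$dir/readonly.txt"       # succeeds anyway

ls -A "$dir"    # empty
rmdir "$dir"
```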

And I have to say it: be very careful! You're playing with fire ;) I suggest you debug with something less harmful like the file command.

You can test this out by creating a bunch of files like, e.g.:

touch {a,b,c,d,e,f}

And set permissions as desired on each of them.
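Putting those pieces together, a fuller harmless rehearsal might look like this (scratch directory, example names, and `file` standing in for `rm`):

```shell
# Backdated files with mixed permissions, run through `file` instead of
# `rm` so nothing is destroyed while you check what find would match.
dir=$(mktemp -d)
touch "$dir/a" "$dir/b" "$dir/c"
chmod 400 "$dir/b"                                  # one read-only file
touch -t 202001010000 "$dir/a" "$dir/b" "$dir/c"    # backdate mtime past 99 days

find "$dir" -mtime +99 -type f -exec file {} \;     # lists all three, deletes nothing
rm -rf "$dir"
```

Once the output matches your expectations, swap `file` back for `rm -f` in the real command.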

answered Dec 18 '22 by Miquel