I'm developing a bash script that needs to find files within a single directory that are "old", based on a variable that specifies how many days must pass before the threshold is exceeded and the files are marked for action (which could be anything from move to archive to delete).
The catch is that the modification time of the file is irrelevant in determining how old a file is before it gets acted upon, since the files may be changed infrequently, the execution time of the script can vary, and so on.
The time that determines how old the files are is embedded in the file name itself, in the form YYYY-MM-DD (%F with the date command). Take, for instance, the filename contents-2011-05-23.txt. What command(s) could be run in this directory to find all files older than a certain number of days (I currently have the threshold set to 7 days, but it could change) and print out their file names?
Create a bash script isOld.sh like this:
#!/bin/bash
fileName=$1
numDays=$2

# Extract the YYYY-MM-DD portion from a name like contents-2011-05-23.txt
fileDt=$(echo "$fileName" | sed 's/^[^-]*-\([^.]*\)\..*$/\1/')

d1=$(date '+%s')                      # current time, in epoch seconds
d2=$(date -d "$fileDt" '+%s')         # date taken from the file name, in epoch seconds

diff=$((d1 - d2))
seconds=$((numDays * 24 * 60 * 60))   # threshold converted to seconds

# Print the file name if it is at least numDays old
[[ $diff -ge $seconds ]] && echo "$fileName"
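To see what the extraction and conversion produce, you can test the two pieces interactively; a minimal check on the sample name from the question might look like this (the exact epoch value is for UTC and will differ in other timezones):

$ echo "contents-2011-05-23.txt" | sed 's/^[^-]*-\([^.]*\)\..*$/\1/'
2011-05-23
$ date -d "2011-05-23" '+%s'
1306108800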
Then give execute permission to the script by running:
chmod +x ./isOld.sh
And finally, run this find command from the top of your directory to print the files older than 7 days:
find . -name "contents-*" -exec ./isOld.sh {} 7 \;
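As a concrete illustration (the directory contents and the current date here are hypothetical), if the directory held contents-2011-05-23.txt and contents-2011-05-29.txt and the command were run on 2011-05-31, only the first file exceeds the 7-day threshold:

$ find . -name "contents-*" -exec ./isOld.sh {} 7 \;
./contents-2011-05-23.txt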