I'm somewhat new to Python and have been trying to figure this out on my own, but I've only gotten bits and pieces so far. Basically I'm looking for a script that will recursively search a directory and its sub-directories and delete files that are at least 24 hours old, but not alter the directories themselves. Any advice or examples are greatly appreciated.
This uses os.walk to recursively search a directory. For each file, it reads the modification time with os.path.getmtime and compares it with datetime.datetime.now() (the current time); a datetime.timedelta of 24 hours serves as the cutoff.
It searches os.path.curdir, which is the current working directory when the script is invoked. You can set dir_to_search to something else, e.g. a parameter passed to the script.
import os
import datetime

dir_to_search = os.path.curdir
for dirpath, dirnames, filenames in os.walk(dir_to_search):
    for file in filenames:
        curpath = os.path.join(dirpath, file)
        # Compare the file's last modification time against the current time
        file_modified = datetime.datetime.fromtimestamp(os.path.getmtime(curpath))
        if datetime.datetime.now() - file_modified > datetime.timedelta(hours=24):
            os.remove(curpath)
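If you want to pass the directory as a parameter to the script, a minimal sketch using sys.argv (the argument handling here is my own assumption about how you'd invoke it; it falls back to the current directory when no argument is given):

import os
import sys

# Use the first command-line argument as the directory to search,
# defaulting to the current directory if none is supplied.
dir_to_search = sys.argv[1] if len(sys.argv) > 1 else os.path.curdir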
If you need it to check all files in all directories recursively, something like this ought to do it:
import os, time

path = "/path/to/folder"

def flushdir(dir):
    now = time.time()
    for f in os.listdir(dir):
        fullpath = os.path.join(dir, f)
        if os.path.isdir(fullpath):
            # Always recurse into subdirectories; only files get deleted,
            # so the directories themselves are left untouched.
            flushdir(fullpath)
        elif os.path.isfile(fullpath):
            # 86400 seconds = 24 hours
            if os.stat(fullpath).st_mtime < (now - 86400):
                os.remove(fullpath)

flushdir(path)
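As a sanity check before letting it delete anything, you could do a dry run that prints instead of removing (this preview function is just an illustrative sketch of my own, not part of the answer above):

import os, time

def preview_old_files(dir, max_age_seconds=86400):
    # Print the files that would be deleted, without removing anything.
    now = time.time()
    for f in os.listdir(dir):
        fullpath = os.path.join(dir, f)
        if os.path.isdir(fullpath):
            preview_old_files(fullpath, max_age_seconds)
        elif os.path.isfile(fullpath) and os.stat(fullpath).st_mtime < (now - max_age_seconds):
            print("Would delete:", fullpath)

preview_old_files("/path/to/folder")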