I'm writing a Python backup script and I need to find the oldest file in a directory (and its sub-directories). I also need to filter it down to *.avi files only.
The script will always be running on a Linux machine. Is there some way to do it in Python, or would running some shell commands be better?
At the moment I'm running df to get the free space on a particular partition, and if there is less than 5 gigabytes free, I want to start deleting the oldest *.avi files until that condition is met.
Hm. Nadia's answer is closer to what you meant to ask; however, for finding the (single) oldest file in a tree, try this:
import os

def oldest_file_in_tree(rootfolder, extension=".avi"):
    return min(
        (os.path.join(dirname, filename)
         for dirname, dirnames, filenames in os.walk(rootfolder)
         for filename in filenames
         if filename.endswith(extension)),
        key=lambda fn: os.stat(fn).st_mtime)
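For example, a quick sanity check could look like this (the folder name is just a hypothetical path); note that min() raises ValueError if no matching file is found:

print(oldest_file_in_tree("/home/user/videos"))  # hypothetical path; prints the oldest .avi under it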
With a little modification, you can get the n oldest files (similar to Nadia's answer):
import os, heapq

def oldest_files_in_tree(rootfolder, count=1, extension=".avi"):
    return heapq.nsmallest(count,
        (os.path.join(dirname, filename)
         for dirname, dirnames, filenames in os.walk(rootfolder)
         for filename in filenames
         if filename.endswith(extension)),
        key=lambda fn: os.stat(fn).st_mtime)
Note that using the .endswith method allows calls such as
oldest_files_in_tree("/home/user", 20, (".avi", ".mov"))
to select more than one extension, since str.endswith accepts a tuple of suffixes.
Finally, should you want the complete list of files, ordered by modification time, in order to delete as many as required to free space, here's some code:
import os

def files_to_delete(rootfolder, extension=".avi"):
    return sorted(
        (os.path.join(dirname, filename)
         for dirname, dirnames, filenames in os.walk(rootfolder)
         for filename in filenames
         if filename.endswith(extension)),
        key=lambda fn: os.stat(fn).st_mtime,
        reverse=True)
Note that reverse=True puts the oldest files at the end of the list, so for the next file to delete you just do a file_list.pop() (popping from the end of a list is O(1), while popping from the front is O(n)).
By the way, for a complete solution to your issue, since you are running on Linux, where os.statvfs is available, you can do:
import os

def free_space_up_to(free_bytes_required, rootfolder, extension=".avi"):
    file_list = files_to_delete(rootfolder, extension)
    while file_list:
        statv = os.statvfs(rootfolder)
        if statv.f_bfree * statv.f_bsize >= free_bytes_required:
            break
        os.remove(file_list.pop())
statvfs.f_bfree is the number of free blocks on the device and statvfs.f_bsize is the block size. We take the statvfs of rootfolder, so mind any symbolic links pointing to other devices, where we could delete many files without actually freeing up space on this device.
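As a usage sketch, going back to the 5 gigabyte figure from the question (the folder name is just a placeholder):

# Delete the oldest .avi files under /home/user/videos (hypothetical path)
# until at least 5 GiB are free on the filesystem that folder lives on.
free_space_up_to(5 * 1024**3, "/home/user/videos")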
UPDATE (copying a comment by Juan):
Depending on the OS and filesystem implementation, you may want to multiply f_bfree by f_frsize rather than f_bsize. In some implementations, the latter is the preferred I/O request size. For example, on a FreeBSD 9 system I just tested, f_frsize was 4096 and f_bsize was 16384. POSIX says the block count fields are "in units of f_frsize" (see http://pubs.opengroup.org/onlinepubs/9699919799//basedefs/sys_statvfs.h.html).
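If you want to follow that advice, here is a minimal sketch (my own variation, not part of the original code) that uses f_frsize for the free-byte calculation and falls back to f_bsize if f_frsize is reported as 0:

import os

def free_bytes(path):
    # Per POSIX, the block counts are in units of f_frsize; fall back to
    # f_bsize if the filesystem reports f_frsize as 0.
    statv = os.statvfs(path)
    return statv.f_bfree * (statv.f_frsize or statv.f_bsize)

print(free_bytes("/"))  # free bytes on the root filesystem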
You can use the stat and fnmatch modules together to find the files.
ST_MTIME refers to the last modification time. You can choose another value if you want.
import os, stat, fnmatch

file_list = []
for filename in os.listdir('.'):
    if fnmatch.fnmatch(filename, '*.avi'):
        file_list.append((os.stat(filename)[stat.ST_MTIME], filename))
Then you can order the list by time and delete according to it.
file_list.sort(key=lambda a: a[0])
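For completeness, a hedged sketch of the deletion step on top of that sorted list; need_more_space() is a hypothetical placeholder for whatever free-space check you use (df, os.statvfs, ...):

# file_list is sorted oldest-first, so walk it from the front.
for mtime, filename in file_list:
    if not need_more_space():  # hypothetical check, e.g. based on os.statvfs
        break
    os.remove(filename)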