I’m writing a Python backup script and I need to find the oldest file in a directory (and its sub-directories). I also need to filter it down to *.avi files only.
The script will always be running on a Linux machine. Is there some way to do it in Python or would running some shell commands be better?
At the moment I’m running df to get the free space on a particular partition, and if there is less than 5 gigabytes free, I want to start deleting the oldest *.avi files until that condition is met.
Hm. Nadia’s answer is closer to what you meant to ask; however, for finding the (single) oldest file in a tree, try this:
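A sketch of that approach, walking the tree with `os.walk` and taking the `min` by modification time (the function name `oldest_file_in_tree` is just illustrative; note `min` raises `ValueError` if no matching file exists):

```python
import os

def oldest_file_in_tree(rootfolder, extension=".avi"):
    """Return the path of the oldest matching file anywhere under rootfolder."""
    return min(
        (os.path.join(dirname, filename)
         for dirname, dirnames, filenames in os.walk(rootfolder)
         for filename in filenames
         if filename.endswith(extension)),
        key=lambda fn: os.stat(fn).st_mtime)  # oldest = smallest mtime
```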
With a little modification, you can get the *n* oldest files (similar to Nadia’s answer); note that the `.endswith` method also accepts a tuple of suffixes, so a single call can select more than one extension:
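One way to sketch this is with `heapq.nsmallest`, which avoids sorting the whole list just to get the first *n* entries (again, the function name is illustrative):

```python
import heapq
import os

def oldest_files_in_tree(rootfolder, count=1, extension=".avi"):
    """Return the count oldest matching files under rootfolder, oldest first."""
    return heapq.nsmallest(
        count,
        (os.path.join(dirname, filename)
         for dirname, dirnames, filenames in os.walk(rootfolder)
         for filename in filenames
         if filename.endswith(extension)),
        key=lambda fn: os.stat(fn).st_mtime)

# str.endswith accepts a tuple, so more than one extension
# can be selected in a single call, e.g.:
# oldest_files_in_tree("/some/folder", 20, (".avi", ".mov"))
```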
Finally, should you want the complete list of files, ordered by modification time, in order to delete as many as required to free space, here’s some code:
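A minimal sketch of that full listing, using `sorted` with `reverse=True` (the helper name `files_by_mtime` is my own):

```python
import os

def files_by_mtime(rootfolder, extension=".avi"):
    """All matching files under rootfolder, newest first (oldest at the end)."""
    return sorted(
        (os.path.join(dirname, filename)
         for dirname, dirnames, filenames in os.walk(rootfolder)
         for filename in filenames
         if filename.endswith(extension)),
        key=lambda fn: os.stat(fn).st_mtime,
        reverse=True)  # descending mtime: pop() yields the oldest file
```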
Note that `reverse=True` puts the oldest files at the end of the list, so for the next file to delete you just do a `file_list.pop()`.

By the way, for a complete solution to your issue, since you are running on Linux, where `os.statvfs` is available, you can do:
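A sketch of the whole loop, assuming Linux (where `os.statvfs` exists); the function name `free_space_up_to` is illustrative, and deletions are permanent, so test this against a scratch directory first:

```python
import os

def free_space_up_to(free_bytes_required, rootfolder, extension=".avi"):
    """Delete the oldest matching files under rootfolder until the
    filesystem holding rootfolder has free_bytes_required bytes free
    (or no matching files remain)."""
    file_list = sorted(
        (os.path.join(dirname, filename)
         for dirname, dirnames, filenames in os.walk(rootfolder)
         for filename in filenames
         if filename.endswith(extension)),
        key=lambda fn: os.stat(fn).st_mtime,
        reverse=True)  # oldest files end up at the end of the list
    while file_list:
        statvfs = os.statvfs(rootfolder)
        if statvfs.f_bfree * statvfs.f_bsize >= free_bytes_required:
            break  # enough free space already
        os.remove(file_list.pop())  # delete the current oldest file

# e.g. free_space_up_to(5 * 2**30, "/media/videos")  # keep 5 GiB free
```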
`statvfs.f_bfree` is the number of free blocks on the device and `statvfs.f_bsize` is the block size. We take the `statvfs` of the root folder, so mind any symbolic links pointing to other devices, where we could delete many files without actually freeing up space on this device.

UPDATE (copying a comment by Juan):
Depending on the OS and filesystem implementation, you may want to multiply f_bfree by f_frsize rather than f_bsize. In some implementations, the latter is the preferred I/O request size. For example, on a FreeBSD 9 system I just tested, f_frsize was 4096 and f_bsize was 16384. POSIX says the block count fields are “in units of f_frsize” (see http://pubs.opengroup.org/onlinepubs/9699919799//basedefs/sys_statvfs.h.html).
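In code, Juan’s point amounts to preferring `f_frsize` as the multiplier (a minimal sketch, querying the root filesystem):

```python
import os

statvfs = os.statvfs("/")
# POSIX specifies that block counts such as f_bfree are in units of
# f_frsize; f_bsize may instead be the preferred I/O request size
# on some systems (e.g. FreeBSD), so use f_frsize for byte counts.
free_bytes = statvfs.f_bfree * statvfs.f_frsize
```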