I'm writing a script to copy compiled files from one location to another.
What I have at the moment is something like this:
import os
import shutil
shutil.copy2(src, dst)
# ... many more shutil.copy2 calls
# src is a filename string
# dst is the directory where the file is to be copied
My problem is that many of the files being copied are large files, and not all of them are re-compiled in every compile cycle. Ideally, I would like to copy only the changed files in this script. Is there any way I can do this?
For background: shutil offers shutil.copy(), shutil.copy2(), and shutil.copytree() for copying files and whole directory trees. shutil.copy2() is identical to shutil.copy() except that it also attempts to preserve the file's metadata, such as its modification time.
You could make use of the file modification time, if that's enough for you:
# dst is a directory, so compare against the copied file inside it
dst_file = os.path.join(dst, os.path.basename(src))
# If more than 1 second difference
if os.stat(src).st_mtime - os.stat(dst_file).st_mtime > 1:
    shutil.copy2(src, dst)
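Note that os.stat raises FileNotFoundError if the destination copy does not exist yet. A small helper can handle that case and cover the whole file list; this is a minimal sketch, where copy_if_newer and compiled_files are illustrative names, not part of the question's script:

import os
import shutil

def copy_if_newer(src, dst_dir):
    # Copy src into dst_dir only if it is missing there
    # or more than 1 second newer than the existing copy
    dst_file = os.path.join(dst_dir, os.path.basename(src))
    if not os.path.exists(dst_file) or os.stat(src).st_mtime - os.stat(dst_file).st_mtime > 1:
        shutil.copy2(src, dst_dir)

for filename in compiled_files:  # hypothetical list of compiled file paths
    copy_if_newer(filename, dst)

This replaces the long list of unconditional shutil.copy2 calls with one loop that skips up-to-date files.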
Or call a synchronization tool like rsync.
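If rsync is installed, one way to invoke it from the script is via subprocess; a sketch using standard rsync options (by default rsync skips files whose size and modification time already match the destination):

import subprocess

# -a (archive) preserves metadata, much like shutil.copy2 does
subprocess.run(["rsync", "-a", src, dst], check=True)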
You can do a smart copy using distutils.file_util.copy_file by setting the optional argument update=1. (Note that distutils is deprecated and was removed in Python 3.12.)
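For example, a minimal sketch using the same src and dst names as the question:

from distutils.file_util import copy_file

# update=1 copies only when the destination copy is missing or older than src
copy_file(src, dst, update=1)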
There is also a version that copies entire directories: distutils.dir_util.copy_tree.
You can then verify that either of those is actually working and only copying required files by enabling logging:
import distutils.log
import distutils.dir_util
distutils.log.set_verbosity(distutils.log.DEBUG)
distutils.dir_util.copy_tree(
    src_dir,
    dst_dir,
    update=1,
    verbose=1,
)
which prints which files were copied.
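On Python 3.12 and later, where distutils is no longer available, the same update-only tree copy can be built from os and shutil. A minimal sketch, assuming the illustrative name update_tree (not a standard function):

import os
import shutil

def update_tree(src_dir, dst_dir):
    # Recursively copy files from src_dir to dst_dir, skipping up-to-date ones
    for root, dirs, files in os.walk(src_dir):
        target = os.path.join(dst_dir, os.path.relpath(root, src_dir))
        os.makedirs(target, exist_ok=True)
        for name in files:
            src_file = os.path.join(root, name)
            dst_file = os.path.join(target, name)
            if (not os.path.exists(dst_file)
                    or os.stat(src_file).st_mtime - os.stat(dst_file).st_mtime > 1):
                print("copying", src_file)  # mirrors the distutils debug output
                shutil.copy2(src_file, dst_file)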