I'm updating some code that previously ran on Fabric v1 and worked fine. Now, however, I'm hitting a problem when transferring a file of roughly 200 MB from local to remote (using connection.put() in Fabric 2.5.0). The transfer appears to succeed, but further attempts to manipulate the file show that only 7-10 MB actually arrived, and my task fails.
I have tried a number of steps to isolate the problem. I can manually transfer the file from one host to the other with no issues, and this simple standalone script also works:
import subprocess

# Shelling out to plain scp: exits 0 and the file arrives intact.
ret = subprocess.Popen(['scp', '/tmp/filename', 'host:/tmp/']).wait()
print(ret)
but running this exact same snippet inside my fabfile.py produces the same behavior as connection.put(): no error message, a return code of 0, but the resulting file on the remote host is only 5-10 MB and corrupted.
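For what it's worth, here is how I've been confirming the truncation; a minimal sketch, where the paths are illustrative and conn is assumed to be a fabric.Connection:

```python
import hashlib

def local_md5(path, chunk_size=1 << 20):
    """MD5 of a local file, read in chunks so a ~200 MB file fits in memory."""
    h = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(chunk_size), b''):
            h.update(chunk)
    return h.hexdigest()

# After the transfer, compare against the remote copy (assumes a fabric
# Connection object named `conn`; paths are illustrative):
# conn.put('/tmp/filename', '/tmp/filename')
# remote_sum = conn.run('md5sum /tmp/filename', hide=True).stdout.split()[0]
# if remote_sum != local_md5('/tmp/filename'):
#     print('transfer truncated or corrupted')
```

The checksums differ, which is how I know the remote file is corrupted rather than just reported with the wrong size.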
I'm running the task as: fab -d deploy --target=stage --prompt-for-passphrase
and the debug output (though I don't understand all of it) doesn't obviously contain anything relevant to the problem either. Where can I look to debug this and find a working solution?
EDIT: relevant version info:
Just an idea; this happens to everyone sometimes. Have you checked the FREE SPACE on the destination directory's volume on the remote server? Maybe it is nearly full of logs or similar, and only a few megabytes can be written there. I can't reproduce the case locally, but since the copy fails with both scp (run from inside the fabfile) and Paramiko, that could well be the real cause :)
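If free space is the suspect, a quick check from the fabfile side might look like this (a sketch: shutil.disk_usage is stdlib and checks the local side; the remote check via df assumes a fabric Connection named conn):

```python
import shutil

def free_mb(path='/'):
    """Free space, in whole MB, on the volume containing `path`."""
    return shutil.disk_usage(path).free // (1024 * 1024)

# Locally:
# print(free_mb('/tmp'))
# On the remote host, from a fabric task (illustrative):
# conn.run('df -h /tmp')
```

If free_mb on the destination volume comes back smaller than the file, that would explain a silently truncated copy.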