I am uploading a large number of files to an FTP server. In the middle of my upload the server times out, preventing me from uploading any further. Does anyone know of a way to detect that the server has timed out, reconnect, and continue transmitting the data? I am using Python's ftplib for the transfer.
Thanks
You can simply specify a timeout for the connection (see the sketch below), but for timeouts during a file transfer or other operations it's not so simple.
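A minimal sketch of the connection-level timeout, assuming an anonymous login and a placeholder host name:

from ftplib import FTP
import socket

try:
    # timeout=30 applies to the underlying socket, so a server that never
    # answers the connection attempt raises socket.timeout here.
    ftpc = FTP('ftp.example.com', timeout=30)   # placeholder host
    ftpc.login()                                # anonymous login
except socket.timeout:
    print('connection timed out')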
Because the storbinary and retrbinary methods let you provide a callback, you can implement a watchdog timer. Each time you receive data you reset the timer. If you don't receive data for 30 seconds (or whatever), the watchdog attempts to abort and close the FTP session and sends an event back to your event loop (or whatever). For example:
from ftplib import FTP
import queue
import threading

eventq = queue.Queue()                  # event queue your main loop consumes

# 'myhost' and 'somefile' below are placeholders for your host and file name.
ftpc = FTP(myhost, 'ftp', timeout=30)

def timeout():
    # Watchdog fired: no data for 30 seconds, so tear the session down.
    ftpc.abort()                        # may not work according to the docs
    ftpc.close()
    eventq.put('Abort event')           # or whatever

# Hold the current timer in a list so callback() can swap in a new one.
timerthread = [threading.Timer(30, timeout)]

def callback(data, *args, **kwargs):
    eventq.put(('Got data', data))      # or whatever
    # Reset the watchdog: cancel the old timer and start a fresh one.
    timerthread[0].cancel()
    timerthread[0] = threading.Timer(30, timeout)
    timerthread[0].start()

timerthread[0].start()
ftpc.retrbinary('RETR %s' % (somefile,), callback)
timerthread[0].cancel()
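Once your event loop sees the abort event, it can reconnect and pick up where the transfer left off. The helper below is only a sketch, not part of the watchdog code above: resume_upload, host, localname and remotename are hypothetical names, and it assumes the server supports the SIZE command and resuming STOR via REST (some servers only offer APPE).

from ftplib import FTP

def resume_upload(host, localname, remotename):
    # Hypothetical helper: reconnect and continue a partial upload.
    ftpc = FTP(host, 'ftp', timeout=30)
    ftpc.voidcmd('TYPE I')              # binary mode so SIZE is reliable
    offset = ftpc.size(remotename) or 0 # bytes the server already has
    with open(localname, 'rb') as fp:
        fp.seek(offset)
        # rest=offset sends a REST command so the server appends from there;
        # this only works if the server allows REST before STOR.
        ftpc.storbinary('STOR %s' % remotename, fp, rest=offset)
    ftpc.quit()

For a download, the same idea applies with retrbinary's rest argument and a local file opened in append mode.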
If this isn't good enough, it appears you will have to choose a different API. The Twisted framework has FTP protocol support that should allow you to add timeout logic.