I have created a Python script to connect to a remote server.
datfile = []
for dk in range(len(files)):
    dfnt = files[dk]
    dpst = dfnt.find('.dat')
    if dpst == 15:
        dlist = dfnt[:]
        datfile.append(dlist)

assert datfile == ['a.dat', 'b.dat']
# True
Which, as you can see, creates a list. Now I am passing this list to
ftp.retrbinary('datfile')
but this line returns an error:
TypeError: retrbinary() takes at least 3 arguments (2 given)
I'm not sure what it is looking for?
It's telling you that you aren't supplying enough arguments to the retrbinary method.
The documentation specifies that you must also supply a 'callback' function that gets called for every block of data received. You'll want to write a callback function and do something with the data it gives you (e.g. write it to a file, collect it in memory, etc.)
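Purely for illustration, the shape of such a callback is just a callable that accepts one chunk of data. The name handle_block below is a placeholder I've made up, and this assumes your ftp object is already connected:

def handle_block(data):
    # Called once per chunk received; 'data' is a piece of the file's bytes
    print('received %d bytes' % len(data))

ftp.retrbinary('RETR a.dat', handle_block)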
As a side note, you might ask why it says there are '3' required arguments instead of just '2'. This is because it's also counting the 'self' argument that Python requires on instance methods, but you are implicitly passing that with the ftp object reference.
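If it helps to see that counting, the two calls below should be equivalent (assuming ftp is an ftplib.FTP instance and callback is already defined); in the second form the 'self' argument is visible as the first argument, which is how the interpreter arrives at three:

from ftplib import FTP

ftp.retrbinary('RETR a.dat', callback)       # 'self' passed implicitly via the ftp object
FTP.retrbinary(ftp, 'RETR a.dat', callback)  # 'self' passed explicitly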
EDIT - Looks like I may not have entirely answered your question.
For the command argument, you are supposed to be passing a valid RETR command, not a list.
filenames = ['a.dat', 'b.dat']

# Iterate through all the filenames and retrieve them one at a time
for filename in filenames:
    ftp.retrbinary('RETR %s' % filename, callback)
For the callback, you need to pass something that is callable (usually a function of some sort) that accepts a single argument. The argument is a chunk of data from the file being retrieved. I say a 'chunk' because when you're moving large files around, you rarely want to hold the entire file in memory. The library is designed to invoke your callback iteratively as it receives chunks of data. This allows you to write out chunks of the file so you only have to keep a relatively small amount of data in memory at any given time.
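That said, if a file is small and you did want to collect it in memory rather than write it to disk, one sketch of that approach is to pass the write method of an io.BytesIO buffer as the callback:

import io

buf = io.BytesIO()
ftp.retrbinary('RETR a.dat', buf.write)
contents = buf.getvalue()  # the entire file as a bytes object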
My example here is a bit advanced, but your callback can be a closure inside the for loop that writes to a file which has been opened:
import os

filenames = ['a.dat', 'b.dat']

# Iterate through all the filenames and retrieve them one at a time
for filename in filenames:
    local_filename = os.path.join('/tmp', filename)

    # Open a local file for writing (binary mode)...
    # The 'with' statement ensures that the file will be closed
    with open(local_filename, 'wb') as f:

        # Define the callback as a closure so it can access the opened
        # file in local scope
        def callback(data):
            f.write(data)

        ftp.retrbinary('RETR %s' % filename, callback)
This can also be done more concisely with a lambda expression, but I find people new to Python and some of its functional-style concepts understand the first example more easily. Nevertheless, here's the ftp call with a lambda instead:
ftp.retrbinary('RETR %s' % filename, lambda data: f.write(data))
I suppose you could even do this, passing the write instance method of the file directly as your callback:
ftp.retrbinary('RETR %s' % filename, f.write)
All three of these examples should behave the same way, and hopefully tracing through them will help you to understand what's going on.
I've elided any sort of error handling for the sake of example.
Also, I didn't test any of the above code, so if it doesn't work let me know and I'll see if I can clarify it.
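For reference, here is a rough end-to-end sketch of how the pieces might fit together, also untested; the host name and credentials are placeholders you'd replace with your own:

import os
from ftplib import FTP

# Placeholder host and credentials -- substitute your own
ftp = FTP('ftp.example.com')
ftp.login('username', 'password')

filenames = ['a.dat', 'b.dat']

for filename in filenames:
    local_filename = os.path.join('/tmp', filename)
    # Pass the local file's write method as the callback
    with open(local_filename, 'wb') as f:
        ftp.retrbinary('RETR %s' % filename, f.write)

ftp.quit()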