I am trying to download a file over FTP, and if the connection drops partway through, the download should resume from where it stopped. With the following code snippet I am able to continue the download if I close the connection on my side and then reconnect, but if the connection is closed from the server side I cannot resume the download and the program hangs indefinitely.
#include <stdio.h>
#include <curl/curl.h>

/*
 * This is an example showing how to get a single file from an FTP server.
 * It delays the actual destination file creation until the first write
 * callback so that it won't create an empty file in case the remote file
 * doesn't exist or something else fails.
 */
struct FtpFile {
  const char *filename;
  FILE *stream;
};

static size_t my_fwrite(void *buffer, size_t size, size_t nmemb, void *stream)
{
  struct FtpFile *out = (struct FtpFile *)stream;
  if(out && !out->stream) {
    /* open file for writing */
    out->stream = fopen(out->filename, "wb");
    if(!out->stream)
      return -1; /* failure, can't open file to write */
  }
  return fwrite(buffer, size, nmemb, out->stream);
}

int main(void)
{
  CURL *curl;
  CURLcode res;
  struct FtpFile ftpfile = {
    "dev.zip", /* name to store the file as if successful */
    NULL
  };

  curl_global_init(CURL_GLOBAL_DEFAULT);

  curl = curl_easy_init();
  if(curl) {
    /*
     * You better replace the URL with one that works!
     */
    curl_easy_setopt(curl, CURLOPT_URL,
                     "ftp://root:[email protected]/dev.zip");
    /* Define our callback to get called when there's data to be written */
    curl_easy_setopt(curl, CURLOPT_WRITEFUNCTION, my_fwrite);
    /* Set a pointer to our struct to pass to the callback */
    curl_easy_setopt(curl, CURLOPT_WRITEDATA, &ftpfile);
    /* Switch on full protocol/debug output */
    curl_easy_setopt(curl, CURLOPT_VERBOSE, 1L);

    res = curl_easy_perform(curl);

    /* always cleanup */
    curl_easy_cleanup(curl);

    if(CURLE_OK != res) {
      /* we failed */
      fprintf(stderr, "curl told us %d\n", res);
    }
  }

  if(ftpfile.stream)
    fclose(ftpfile.stream); /* close the local file */

  curl_global_cleanup();

  return 0;
}
Could anyone tell me how I can resume the download when the connection is closed by the remote site? Any help would be appreciated.
Thanks,
Yuvi
The FTP or FTPS server must support the REST FTP command for the transfer of the file to be resumed. You cannot resume a file transfer with the SFTP protocol.
Add a variable to the FtpFile struct so your write callback knows whether it needs to append, and tell libcurl to resume the download from the end of the destination file by setting CURLOPT_RESUME_FROM to the number of bytes already downloaded:
struct FtpFile {
  const char *filename;
  FILE *stream;
  int iAppend;
};
In main, if you want to resume:
curl_easy_setopt(curl, CURLOPT_RESUME_FROM, numberOfBytesToSkip);
If the file doesn't already exist, or it is a new download rather than a resumed one, be sure to set CURLOPT_RESUME_FROM back to 0.
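Roughly, in main that could look like this. This is only a sketch, not part of the original answer: it assumes a POSIX stat() call to measure how much of the local dev.zip is already on disk, and uses the filename/stream/iAppend fields from the struct above.

#include <sys/stat.h>   /* goes at the top of the file */

/* before curl_easy_perform(): see how much of dev.zip we already have */
struct stat st;
long alreadyDownloaded = 0;

if(stat(ftpfile.filename, &st) == 0)
  alreadyDownloaded = (long)st.st_size;

ftpfile.iAppend = (alreadyDownloaded > 0); /* append instead of truncating */

/* 0 means start from the beginning; anything else is the resume offset */
curl_easy_setopt(curl, CURLOPT_RESUME_FROM, alreadyDownloaded);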
In my_fwrite:
out->stream=fopen(out->filename, out->iAppend ? "ab":"wb");
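For completeness, the whole callback would then look something like this (unchanged from the question except for the fopen mode):

static size_t my_fwrite(void *buffer, size_t size, size_t nmemb, void *stream)
{
  struct FtpFile *out = (struct FtpFile *)stream;
  if(out && !out->stream) {
    /* append when resuming, truncate when starting a fresh download */
    out->stream = fopen(out->filename, out->iAppend ? "ab" : "wb");
    if(!out->stream)
      return -1; /* failure, can't open file to write */
  }
  return fwrite(buffer, size, nmemb, out->stream);
}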
P.S. If the offset you need to resume from can be larger than a long (2 GB), look into CURLOPT_RESUME_FROM_LARGE and CURL_OFF_T_C().
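For example (a sketch only; it reuses st.st_size from the earlier snippet, and the constant shown is made up):

/* for offsets that don't fit in a long, use the curl_off_t variant */
curl_off_t resumeFrom = (curl_off_t)st.st_size; /* or e.g. CURL_OFF_T_C(3000000000) */
curl_easy_setopt(curl, CURLOPT_RESUME_FROM_LARGE, resumeFrom);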
In response to the comment requesting additional information on how to know when a transfer has failed:
After curl_easy_perform returns, call:
CURLcode curl_easy_getinfo(CURL *curl, CURLINFO info, ... );
Retrieve from the curl handle:
CURLINFO_HEADER_SIZE
CURLINFO_CONTENT_LENGTH_DOWNLOAD
Add them together and make sure they equal
CURLINFO_SIZE_DOWNLOAD
If they don't match, perform the transfer again, resuming from the new offset.
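Something like this (a sketch of the check described above; the variable names are mine, and whether the header size matters for plain FTP may vary):

double contentLength = 0.0, sizeDownloaded = 0.0;
long headerSize = 0;

curl_easy_getinfo(curl, CURLINFO_HEADER_SIZE, &headerSize);
curl_easy_getinfo(curl, CURLINFO_CONTENT_LENGTH_DOWNLOAD, &contentLength);
curl_easy_getinfo(curl, CURLINFO_SIZE_DOWNLOAD, &sizeDownloaded);

if((double)headerSize + contentLength != sizeDownloaded) {
  /* transfer incomplete: update the resume offset and perform again */
}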
And be sure to use the latest version of curl; it should time out after 60 seconds of not hearing anything from the FTP server it is downloading from.
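If your build still hangs when the server drops the connection silently, you can also make libcurl give up on a stalled transfer yourself. These are standard libcurl options, not something from the answer above, so treat this as an optional extra:

/* abort if less than 1 byte/second is received for 60 seconds */
curl_easy_setopt(curl, CURLOPT_LOW_SPEED_LIMIT, 1L);
curl_easy_setopt(curl, CURLOPT_LOW_SPEED_TIME, 60L);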