I am trying to download files from an FTP server with this code:
using (System.IO.FileStream fileStream = System.IO.File.OpenWrite(filePath))
{
    byte[] buffer = new byte[4096];
    int bytesRead = responseStream.Read(buffer, 0, buffer.Length);
    while (bytesRead > 0)
    {
        fileStream.Write(buffer, 0, bytesRead);
        bytesRead = responseStream.Read(buffer, 0, buffer.Length);
    }
}
Here is how responseStream is created:
System.IO.Stream responseStream = GetFileAsStream(url, username, password, false);

public static System.IO.Stream GetFileAsStream(string ftpUrl, string username, string password, bool usePassive)
{
    System.Net.FtpWebRequest request = (System.Net.FtpWebRequest)System.Net.WebRequest.Create(ftpUrl);
    request.KeepAlive = false;
    request.ReadWriteTimeout = 120000;
    request.Timeout = -1;
    request.UsePassive = usePassive;
    request.Credentials = new System.Net.NetworkCredential(username, password);
    request.Method = System.Net.WebRequestMethods.Ftp.DownloadFile;
    System.IO.Stream fileResponseStream;
    System.Net.FtpWebResponse fileResponse = (System.Net.FtpWebResponse)request.GetResponse();
    fileResponseStream = fileResponse.GetResponseStream();
    return fileResponseStream;
}
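(Note: Timeout = -1 is System.Threading.Timeout.Infinite, so the request itself, i.e. the GetResponse call, never times out; ReadWriteTimeout, 120 seconds here, is what applies to reads on the response stream.)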
It works fine with smaller files, but when a file is bigger (e.g. 150 MB) the process hangs. For some reason the program does not realize that the download has completed, and it keeps trying to read more bytes.
I would prefer answers that do not rely on external libraries. Thank you.
I solved my problem by introducing a request timeout which, if reached, makes the program throw a WebException. In that case, the program resumes the download from the place it left off. Here's my code:
This is inside a method that returns true if the file was downloaded and false otherwise:
Digitalez.DirectoryUtil.EnsureDirectoryExists(relativePath);
string filePath = System.IO.Path.Combine(relativePath, fileInfo.Name);
long length = Digitalez.FtpUtil.GetFileLength(fileInfo.FullPath, userName, password, usePassive);
long offset = 0;
int retryCount = 10;
int? readTimeout = 5 * 60 * 1000; // five minutes

// If the file already exists, do not download it
if (System.IO.File.Exists(filePath))
{
    return false;
}

while (retryCount > 0)
{
    using (System.IO.Stream responseStream = Digitalez.FtpUtil.GetFileAsStream(
        fileInfo.FullPath, userName, password, usePassive, offset,
        requestTimeout: readTimeout ?? System.Threading.Timeout.Infinite))
    {
        using (System.IO.FileStream fileStream = new System.IO.FileStream(filePath, System.IO.FileMode.Append))
        {
            byte[] buffer = new byte[4096];
            try
            {
                int bytesRead = responseStream.Read(buffer, 0, buffer.Length);
                while (bytesRead > 0)
                {
                    fileStream.Write(buffer, 0, bytesRead);
                    bytesRead = responseStream.Read(buffer, 0, buffer.Length);
                }
                return true;
            }
            catch (System.Net.WebException)
            {
                // Do nothing - consume this exception to force a new read of the rest of the file
            }
        }

        // The read timed out; resume from however many bytes made it to disk
        if (System.IO.File.Exists(filePath))
        {
            offset = new System.IO.FileInfo(filePath).Length;
        }
        else
        {
            offset = 0;
        }

        retryCount--;
        if (offset == length)
        {
            return true;
        }
    }
}
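For completeness, a call site might look like this (the wrapper name DownloadFileWithResume and its parameter list are my own naming for illustration, since only the method body is shown above):

bool downloaded = DownloadFileWithResume(fileInfo, relativePath, userName, password, usePassive);
if (!downloaded)
{
    System.Console.WriteLine("Giving up on {0} after 10 retries.", fileInfo.Name);
}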
Digitalez.FtpUtil:
public static System.IO.Stream GetFileAsStream(string ftpUrl, string username, string password, bool usePassive, long offset, int requestTimeout)
{
    System.Net.FtpWebRequest request = (System.Net.FtpWebRequest)System.Net.WebRequest.Create(ftpUrl);
    request.KeepAlive = false;
    request.ReadWriteTimeout = requestTimeout;
    request.Timeout = requestTimeout;
    request.ContentOffset = offset; // resume the transfer from this byte position
    request.UsePassive = usePassive;
    request.UseBinary = true;
    request.Credentials = new System.Net.NetworkCredential(username, password);
    request.Method = System.Net.WebRequestMethods.Ftp.DownloadFile;
    System.IO.Stream fileResponseStream;
    System.Net.FtpWebResponse fileResponse = (System.Net.FtpWebResponse)request.GetResponse();
    fileResponseStream = fileResponse.GetResponseStream();
    return fileResponseStream;
}
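Setting ContentOffset makes FtpWebRequest send an FTP REST command before the download, so the server starts the transfer at that byte offset instead of at the beginning. Note that the server has to support REST for the resume to work.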
public static long GetFileLength(string ftpUrl, string username, string password, bool usePassive)
{
    System.Net.FtpWebRequest request = (System.Net.FtpWebRequest)System.Net.WebRequest.Create(ftpUrl);
    request.KeepAlive = false;
    request.UsePassive = usePassive;
    request.Credentials = new System.Net.NetworkCredential(username, password);
    request.Method = System.Net.WebRequestMethods.Ftp.GetFileSize;
    System.Net.FtpWebResponse lengthResponse = (System.Net.FtpWebResponse)request.GetResponse();
    long length = lengthResponse.ContentLength;
    lengthResponse.Close();
    return length;
}
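For reference, a minimal call (with placeholder URL and credentials) would be:

long remoteLength = Digitalez.FtpUtil.GetFileLength("ftp://example.com/archive.zip", "user", "pass", true);
System.Console.WriteLine("Remote file is {0} bytes.", remoteLength);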
I haven't tried other servers, but this certainly does the trick.
I have successfully downloaded several files over 150 MB using your code. As others have suggested, it is likely a problem with your FTP server.
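If you want to rule out a transfer-mode issue on the server, one quick check is to run the same download in both passive and active mode. This is just a sketch reusing the GetFileAsStream from the question; the output file names are placeholders, and Stream.CopyTo needs .NET 4.0 or later:

foreach (bool passive in new[] { true, false })
{
    try
    {
        using (System.IO.Stream stream = GetFileAsStream(url, username, password, passive))
        using (System.IO.FileStream file = System.IO.File.Create(passive ? "test-passive.bin" : "test-active.bin"))
        {
            stream.CopyTo(file); // copies until the server closes the data connection
        }
        System.Console.WriteLine("passive={0}: OK", passive);
    }
    catch (System.Net.WebException ex)
    {
        System.Console.WriteLine("passive={0}: {1}", passive, ex.Message);
    }
}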