C# Getting a complete file list using ftpRequest is slow

I want to get the filename, file size, and last modified time of each file on an FTP server, then populate a ListView with them.

It worked really well until I switched FTP host, and now it's really sluggish, despite the new host being just as fast in real FTP clients.

Any apparent reason as to why?

Also, is it really necessary to send the login credentials each time?

I'm using the first method to get a string array, then iterate through it and use the second one on each item to get the file size:

    public static string[] GetFileList()
    {
        try
        {
            FtpWebRequest request = (FtpWebRequest)WebRequest.Create(new Uri("ftp://mysite.se/"));
            request.Credentials = new NetworkCredential(settings.Username, settings.Password);
            request.Method = WebRequestMethods.Ftp.ListDirectory;
            request.UseBinary = true;

            StringBuilder result = new StringBuilder();
            using (WebResponse response = request.GetResponse())
            using (StreamReader reader = new StreamReader(response.GetResponseStream()))
            {
                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    result.Append(line);
                    result.Append('\n');
                }
            }
            // Trim the trailing '\n' before splitting; TrimEnd is safe even if
            // the listing came back empty (the old LastIndexOf approach would throw).
            return result.ToString().TrimEnd('\n').Split('\n');
        }
        catch (Exception ex)
        {
            System.Windows.Forms.MessageBox.Show(ex.Message);
            return null;
        }
    }

    public static int GetFileSize(string file)
    {
        try
        {
            FtpWebRequest request = (FtpWebRequest)WebRequest.Create(new Uri("ftp://mysite.se/" + file));
            request.UseBinary = true;
            request.Credentials = new NetworkCredential(settings.Username, settings.Password);
            request.Method = WebRequestMethods.Ftp.GetFileSize;

            // Dispose the response so the connection can be reused.
            using (WebResponse response = request.GetResponse())
            {
                return (int)response.ContentLength;
            }
        }
        catch (Exception)
        {
            return 1337; // sentinel for "size unknown"
        }
    }
asked Dec 17 '22 by pastapockets

1 Answer

The problem is that each GetFileSize call has to reconnect to the server and issue a request for the file size. If you can set things up to use a single, persistent connection then you'll save connection time, but will still spend a lot of time sending a request for each file and waiting for a response.

(Edit: this may already be the case. MSDN says: Multiple FtpWebRequests reuse existing connections, if possible.)
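To lean on that connection reuse, make sure each request leaves the control connection open and gets disposed promptly. A minimal sketch of building such a request (the host, path, and credentials here are placeholders, not from the question):

```csharp
using System;
using System.Net;

class ConnectionReuseSketch
{
    // Builds an FtpWebRequest configured so the underlying control
    // connection stays open for the next request to the same host.
    // KeepAlive already defaults to true; setting it explicitly
    // documents the intent. No network traffic happens until
    // GetResponse() is called.
    public static FtpWebRequest BuildRequest(string uri, string user, string password)
    {
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(uri);
        request.Credentials = new NetworkCredential(user, password);
        request.Method = WebRequestMethods.Ftp.GetFileSize;
        request.KeepAlive = true; // keep the control connection open for reuse
        return request;
    }
}
```

Note that even with a reused connection, the credentials object still has to be attached to every request; the framework decides whether a fresh login is actually needed on the wire.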

If you use ListDirectoryDetails rather than ListDirectory, then the server will probably send down more information (file size, permissions, etc) along with each file name. This wouldn't take any longer than just doing ListDirectory, and you could pull the name and size out of each line and store the sizes for later.

However, different servers may send down the information in different formats, and some may not send the size info at all, so this may not help if you need your program to reliably use any FTP server.
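With that caveat in mind, here is a sketch of pulling the name and size out of a Unix-style `ListDirectoryDetails` line. The field layout in the regex is an assumption (it matches typical `ls -l` output) and will need adjusting for servers that use a different listing format:

```csharp
using System;
using System.Text.RegularExpressions;

class ListingParser
{
    // Assumed Unix-style "ls -l" line, e.g.:
    // "-rw-r--r--   1 user  group      1234 Dec 17 02:12 report.txt"
    // permissions, link count, owner, group, size, month, day, time/year, name
    static readonly Regex UnixLine = new Regex(
        @"^(?<perm>[\-dl][rwxsStT\-]{9})\s+\d+\s+\S+\s+\S+\s+" +
        @"(?<size>\d+)\s+\S+\s+\d+\s+\S+\s+(?<name>.+)$");

    // Returns false for lines that don't match the assumed format,
    // so callers can fall back to a per-file SIZE request.
    public static bool TryParse(string line, out string name, out long size)
    {
        Match m = UnixLine.Match(line);
        if (m.Success)
        {
            name = m.Groups["name"].Value;
            size = long.Parse(m.Groups["size"].Value);
            return true;
        }
        name = null;
        size = 0;
        return false;
    }
}
```

One pass over the detailed listing then gives you every name and size in a single round trip, instead of one `GetFileSize` request per file.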

answered Jan 23 '23 by andrewffff