
Getting files recursively: skip files/directories that cannot be read?

I want to get all of the files in a directory in an array (including the files in subfolders)

string[] filePaths = Directory.GetFiles(@"c:\", "*", SearchOption.AllDirectories);

The problem with this is that if an exception is thrown, the entire call fails. Is there a better way to do this so that folders that cannot be accessed are simply skipped?

asked Aug 13 '12 by Wilson

1 Answer

You'd probably have to do a bit more typing yourself then, and write a directory walker like this one:

    public static string[] FindAllFiles(string rootDir) {
        // Breadth-first walk: directories still to visit, and files found so far.
        var pathsToSearch = new Queue<string>();
        var foundFiles = new List<string>();

        pathsToSearch.Enqueue(rootDir);

        while (pathsToSearch.Count > 0) {
            var dir = pathsToSearch.Dequeue();

            try {
                // Collect the files in this directory.
                foreach (var file in Directory.GetFiles(dir)) {
                    foundFiles.Add(file);
                }

                // Queue the subdirectories for later visits.
                foreach (var subDir in Directory.GetDirectories(dir)) {
                    pathsToSearch.Enqueue(subDir);
                }

            } catch (UnauthorizedAccessException) {
                // Skip directories we aren't allowed to read.
            }
        }

        return foundFiles.ToArray();
    }
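
For reference, a minimal call site might look like this (the root path here is just an example):

    // Hypothetical usage: walk the whole C: drive, skipping anything unreadable.
    string[] filePaths = FindAllFiles(@"c:\");
    Console.WriteLine($"Found {filePaths.Length} files.");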
answered Nov 12 '22 by ikh