I've got to get a listing of a directory that contains about 2 million files, but when I run an ls
command on it, nothing comes back. I've waited 3 hours. I've tried ls | tee directory.txt
, but that seems to hang forever.
I assume the server is doing a lot of inode sorting. Is there any way to speed up the ls
command to get just a listing of filenames? I don't care about sizes, dates, permissions, or the like at this time.
ls -U
will do the listing without sorting.
Another source of slowness is --color
. On some Linux machines, there is a convenience alias that adds --color=auto
to the ls call, making it look up file attributes for each file found (slow) in order to color the display. This can be avoided with ls -U --color=never
or \ls -U
, which bypasses the alias.
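To illustrate, here is a minimal sketch (the /tmp/ls_unsorted_demo path is just a scratch directory invented for the example): combining the two flags streams names straight to a file without sorting or per-file attribute lookups.

```shell
# Scratch directory with a few files for the demo (hypothetical path).
mkdir -p /tmp/ls_unsorted_demo
touch /tmp/ls_unsorted_demo/a /tmp/ls_unsorted_demo/b /tmp/ls_unsorted_demo/c

# -U: list in directory order, no sorting.
# --color=never: skip the per-file attribute lookups done for coloring.
# The leading backslash bypasses any ls alias that would re-add --color=auto.
\ls -U --color=never /tmp/ls_unsorted_demo > /tmp/ls_unsorted_demo.txt

# Count the names written out.
wc -l < /tmp/ls_unsorted_demo.txt
```

Redirecting to a file, as with the tee attempt in the question, then lets you inspect the listing while ls keeps streaming.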
I have a directory with 4 million files in it, and the only way I could get ls to spit out filenames immediately, without a lot of churning first, was
ls -1U
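As an alternative sketch (the demo path below is hypothetical), find also emits entries in directory order without sorting, and on GNU systems -printf can restrict the output to bare filenames, one per line:

```shell
# Scratch directory for the demo (hypothetical path).
mkdir -p /tmp/find_unsorted_demo
touch /tmp/find_unsorted_demo/x /tmp/find_unsorted_demo/y

# List only the immediate children, unsorted, one bare name per line.
# -printf '%f\n' is a GNU find extension; on other systems, use -print
# and strip the directory prefix afterwards.
find /tmp/find_unsorted_demo -mindepth 1 -maxdepth 1 -printf '%f\n' \
    > /tmp/find_unsorted_demo.txt

cat /tmp/find_unsorted_demo.txt
```

Like ls -U, this starts printing as soon as entries are read from the directory, rather than after a full sort.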