Browse to the folder containing the files you want to count. Click one of the files in that folder and press Ctrl + A to select all of its files and folders. The Explorer status bar shows how many items are selected.
The tool wc is the "word counter" in UNIX and UNIX-like operating systems, but you can also use it to count lines in a file by adding the -l option. wc -l foo will count the number of lines in foo.
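For example, as a quick sanity check (foo here is just a placeholder file name):

printf 'alpha\nbeta\ngamma\n' > foo    # create a three-line file
wc -l foo                              # prints: 3 foo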
Use File Explorer: open the folder and select all the subfolders or files, either manually or with the Ctrl + A shortcut. If you select manually, you can omit particular files. The total count appears near the bottom left of the window. Repeat the same for the files inside each subfolder.
This simple one-liner should work in any shell, not just bash:
ls -1q log* | wc -l
ls -1q will give you one line per file, even if they contain whitespace or special characters such as newlines.
The output is piped to wc -l, which counts the number of lines.
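To see why -q matters, here's a quick check in a scratch directory (the file names, and bash's $'\n' quoting, are just for the test):

touch log1 log$'\n'2    # the second name contains a newline
ls -1 log* | wc -l      # without -q: prints 3, because the embedded newline splits one name
ls -1q log* | wc -l     # with -q: prints 2, one line per file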
You can do this safely (i.e. without being tripped up by files with spaces or newlines in their names) with bash:
$ shopt -s nullglob
$ logfiles=(*.log)
$ echo ${#logfiles[@]}
You need to enable nullglob so that you don't get the literal *.log in the logfiles array if no files match. (See How to "undo" a 'set -x'? for examples of how to safely reset it.)
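To see what nullglob changes, here is a quick check in a directory containing no .log files:

$ shopt -u nullglob
$ logfiles=(*.log); echo ${#logfiles[@]}    # prints 1: the array holds the literal string *.log
$ shopt -s nullglob
$ logfiles=(*.log); echo ${#logfiles[@]}    # prints 0: the unmatched glob expands to nothing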
Lots of answers here, but some don't take into account:
- file names with spaces or newlines in them
- file names that begin with a hyphen (imagine a file called -l)
- hidden files (if the glob were *.log instead of log*)
- directories that match the glob (e.g. a directory called logs that matches log*)
- extremely large directories whose full listing may not fit in memory
Here's a solution that handles all of them:
ls 2>/dev/null -Ubad1 -- log* | wc -l
Explanation:
- -U causes ls to not sort the entries, meaning it doesn't need to load the entire directory listing in memory.
- -b prints C-style escapes for nongraphic characters, crucially causing newlines to be printed as \n.
- -a prints out all files, even hidden files (not strictly needed when the glob log* implies no hidden files).
- -d prints out directories without attempting to list the contents of the directory, which is what ls normally would do.
- -1 makes sure that it's on one column (ls does this automatically when writing to a pipe, so it's not strictly necessary).
- 2>/dev/null redirects stderr so that if there are 0 log files, the error message is ignored. (Note that shopt -s nullglob would cause ls to list the entire working directory instead.)
- wc -l consumes the directory listing as it's being generated, so the output of ls is never in memory at any point in time.
- -- separates the file names from the command so that they are not understood as arguments to ls (in case log* is removed).
The shell will expand log* to the full list of files, which may exhaust memory if there are a lot of files, so running it through grep is better:
ls -Uba1 | grep ^log | wc -l
This last one handles extremely large directories of files without using a lot of memory (albeit it does use a subshell). The -d is no longer necessary, because it's only listing the contents of the current directory.
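As a quick check that -b keeps the count honest (using a hypothetical test file whose name contains a newline):

touch log$'\n'def
ls -Uba1 | grep ^log | wc -l    # prints 1; -b shows the name as log\ndef on a single line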
For a recursive search:
find . -type f -name '*.log' -printf x | wc -c
wc -c will count the number of characters in the output of find, while -printf x tells find to print a single x for each result.
For a non-recursive search, do this:
find . -maxdepth 1 -type f -name '*.log' -printf x | wc -c
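-printf is a GNU find extension. If your find lacks it (e.g. BSD/macOS find), a sketch that stays newline-safe by counting NUL separators instead:

find . -maxdepth 1 -type f -name '*.log' -print0 | tr -cd '\0' | wc -c    # one NUL byte per file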
The accepted answer for this question is wrong, but I have low rep so can't add a comment to it.
The correct answer to this question is given by Mat:
shopt -s nullglob
logfiles=(*.log)
echo ${#logfiles[@]}
The problem with the accepted answer is that wc -l counts the number of newline characters, including newlines embedded in file names (which are only displayed as '?' when ls -l writes to a terminal). This means that the accepted answer fails when a filename contains a newline character. I have tested the suggested command:
ls -l log* | wc -l
and it erroneously reports a value of 2 even if there is only 1 file matching the pattern whose name happens to contain a newline character. For example:
touch log$'\n'def
ls -l log* | wc -l
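For comparison, the array approach counts the same directory correctly:

shopt -s nullglob
logfiles=(log*)
echo ${#logfiles[@]}    # prints 1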
If you have a lot of files and you don't want to use the elegant shopt -s nullglob and bash array solution, you can use find (and similar tools), as long as you don't print out the file names (which might contain newlines).
find -maxdepth 1 -name "log*" -not -name ".*" -printf '%i\n' | wc -l
This will find all files that match log* and that don't start with a dot. The -not -name ".*" is redundant here, but it's important to note that ls by default does not show dot-files, whereas find includes them by default.
This is a correct answer, and it handles any type of file name you can throw at it, because the file names are never passed between commands. But the shopt nullglob answer is the best answer!
Here is my one liner for this.
file_count=$(shopt -s nullglob; set -- "$directory_to_search_inside"/*; echo $#)
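A usage sketch (the directory path is a placeholder); because shopt runs inside the command substitution's subshell, it doesn't alter the parent shell's settings:

directory_to_search_inside=/some/dir
file_count=$(shopt -s nullglob; set -- "$directory_to_search_inside"/*; echo $#)
echo "$file_count"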