In a /bin/sh script, I'd like to check whether a directory contains only one subdir and no other files (aside from "." and "..", of course). I could probably parse the output of ls, but I also understand that's generally a bad idea. Suggestions?
Reason for the question: When I zip a folder on, say, a Windows machine, and I unzip it under Linux, sometimes I get a directory whose contents are those of the original folder; sometimes I get a directory containing exactly one subdir, whose contents are those of the original folder. (I assume that there's something that varies in the way that I use zip under Windows, or that the various Windows machines I use are configured slightly differently, or ...who knows?) Anyhow, I'd like, on the Linux side, to handle both kinds of results in more or less the same way, hence this question.
For those thinking "What if your Windows-side folder really did contain just one subdir?", it happens that that's OK in this case, although I grant that it's a corner-case for the problem specification.
find would be a good tool for this. It has some neat arguments:
-maxdepth 1 so it does not search recursively
-type d so it searches only for directories
-printf 1 to overcome the problem with weird filenames (it prints 1 instead of the file name)
The full command is then:
find DIRECTORY -maxdepth 1 -type d -printf 1
This will print one character for each directory, including DIRECTORY itself, so you are looking for output that is exactly two characters long (find has a nice feature of ignoring . and .. when searching).
Then, you want to check if there are no other (non-directory) files:
find DIRECTORY -maxdepth 1 ! -type d -printf 1
The full check will then be:
if [ "$(find DIRECTORY -maxdepth 1 -type d -printf 1 | wc -m)" -eq 2 ] &&
   [ "$(find DIRECTORY -maxdepth 1 ! -type d -printf 1 | wc -m)" -eq 0 ]; then
# It has only one subdirectory and no other content
fi
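To see the two-command check in action, here is a throwaway demonstration (GNU find assumed for -printf; the scratch paths are purely illustrative):

```shell
#!/bin/sh
# Scratch directory with exactly one subdirectory.
d=$(mktemp -d)
mkdir "$d/sub"

# "11" -> 2 characters (the directory itself plus its one subdir).
dirs=$(find "$d" -maxdepth 1 -type d -printf 1 | wc -m)
# "" -> 0 characters (no non-directory entries).
files=$(find "$d" -maxdepth 1 ! -type d -printf 1 | wc -m)

if [ "$dirs" -eq 2 ] && [ "$files" -eq 0 ]; then
    echo "exactly one subdirectory, nothing else"
fi

rm -rf "$d"
```

Adding any regular file to "$d" makes the second count nonzero and the check fails, as intended.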
Or, you can make it one command using -printf's %y, which prints the file type (d for directory):
if [ "$(find DIRECTORY -maxdepth 1 -printf %y)" = "dd" ]; then
# It has only one subdirectory and no other content
fi
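A quick sanity check of the %y variant against a scratch directory (GNU find assumed; names like only_subdir are made up for illustration):

```shell
#!/bin/sh
# Scratch directory containing exactly one subdirectory.
tmp=$(mktemp -d)
mkdir "$tmp/only_subdir"

# %y prints one type letter per entry: 'd' for the top directory
# itself plus 'd' for the lone subdir -> "dd".
types=$(find "$tmp" -maxdepth 1 -printf %y)
[ "$types" = "dd" ] && echo "single subdir, no other content"

# A stray regular file adds an 'f', so the test no longer matches.
touch "$tmp/stray.txt"
types=$(find "$tmp" -maxdepth 1 -printf %y)
[ "$types" = "dd" ] || echo "extra content present"

rm -rf "$tmp"
```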
Your biggest issue in /bin/sh will be invisible files, since a * doesn't catch them by default.
This will do what you want, I think:
#!/bin/bash
count_()
{
echo $(( $# - 2 )) # -2 to remove . and ..
}
count()
{
count_ * .*
}
ITEM_COUNT=$(count)
Of course you can adapt it to take a path as an argument if you wish.
Example output:
bash-3.2$ count
3
bash-3.2$ ll
total 0
drwxr-xr-x 5 christopher wheel 170 Mar 18 2014 .
drwxrwxrwx 6 root wheel 204 Jul 5 12:28 ..
drwxr-xr-x 14 christopher wheel 476 Mar 18 2014 .git
drwxr-xr-x 5 christopher wheel 170 Mar 18 2014 bin
drwxr-xr-x 4 christopher wheel 136 Mar 18 2014 pylib
Another example:
sh-3.2$ count_()
> {
> echo $(( $# - 2 )) # -2 to remove . and ..
> }
sh-3.2$ count()
> {
> count_ * .*
> }
sh-3.2$ ITEM_COUNT=$(count)
sh-3.2$ echo ${ITEM_COUNT}
3
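A sketch of the path-argument variant mentioned above (count_in is a made-up name). It also guards against two pitfalls of the `count_ * .*` approach: an unmatched glob expanding to itself, and newer bash versions where `.*` no longer returns . and .. (the globskipdots behavior), so the loop skips them explicitly instead of subtracting 2:

```shell
#!/bin/sh
# count_in DIR: print the number of entries in DIR, hidden files included,
# without assuming that .* always matches . and ..
count_in()
{
    (
        cd "$1" || exit 1       # subshell, so the cd does not leak out
        n=0
        for entry in * .*; do
            case "$entry" in
                .|..) ;;        # ignore . and .. when the glob returns them
                '*'|'.*')       # pattern matched nothing and stayed literal...
                    [ -e "$entry" ] && n=$((n + 1)) ;;  # ...unless a file really has that name
                *) n=$((n + 1)) ;;
            esac
        done
        echo "$n"
    )
}

tmp=$(mktemp -d)
mkdir "$tmp/sub"
touch "$tmp/.hidden"
count_in "$tmp"     # prints 2: sub and .hidden
rm -rf "$tmp"
```

An empty directory correctly reports 0, which the bare `* .*` version gets wrong.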
Sidenote:
You're right that different zip implementations handle things differently, but on Linux, many tools treat zip /path/to/folder and zip /path/to/folder/ differently (which is absurdly irritating). If you're working in a controlled environment, you might want to instead normalize how things get zipped. However, if this is a user-facing thing, then that sucks.
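If normalizing at zip time isn't possible, flattening the wrapper directory after unzipping could look like this sketch (GNU find assumed; the wrapper/file names are illustrative, and hidden files inside the wrapper would need extra care, as the comment notes):

```shell
#!/bin/sh
# Sketch: if "$dir" holds exactly one subdirectory and nothing else,
# move that subdirectory's contents up one level.
dir=$(mktemp -d)
mkdir "$dir/wrapper"
touch "$dir/wrapper/file.txt"

if [ "$(find "$dir" -maxdepth 1 -printf %y)" = "dd" ]; then
    inner=$(find "$dir" -mindepth 1 -maxdepth 1 -type d)
    mv "$inner"/* "$dir"/    # hidden files in "$inner" need extra care
    rmdir "$inner"           # succeeds only if "$inner" is now empty
fi

ls "$dir"    # file.txt
```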
If you're not using bash
as the invoking shell:
countFiles.sh:
#!/bin/bash
count_()
{
echo $(( $# - 2 )) # -2 to remove . and ..
}
count_ * .*
scriptThatWantsTheCountOfFiles:
#!/bin/tcsh
set count = `./countFiles.sh`