I was trying to do this to decide whether to redirect stdin to a file or not:
[ ...some condition here... ] && input=$fileName || input="&0"
./myScript < $input
But that doesn't work: when the variable $input is "&0", bash treats it as a literal filename rather than a file-descriptor duplication.
However, I could just do:
if [ ...condition... ]; then
    ./myScript <$fileName
else
    ./myScript
fi
The problem is that ./myScript is actually a long command line that I don't want to duplicate, but it's also not long enough to be worth wrapping in a function.
Then it occurred to me to do this:
[ ...condition... ] && input=$fileName || input= #empty
cat $input | ./myScript
But that requires running one extra command and a pipe (i.e. a subshell).
Is there another way that's simpler and more efficient?
First of all, stdin is file descriptor 0 (zero), not 1 (which is stdout).
You can duplicate file descriptors or use filenames conditionally like this:
[[ some_condition ]] && exec 3<"$filename" || exec 3<&0
some_long_command_line <&3
Note that the command shown will execute the second exec if either the condition is false or the first exec fails. If you don't want a potential failure to do that, then you should use an if/else:
if [[ some_condition ]]
then
    exec 3<"$filename"
else
    exec 3<&0
fi
but then subsequent redirections from file descriptor 3 will fail if the first redirection failed (after the condition was true).
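Putting the pieces together, a minimal sketch of this approach might look like the following (some_condition, $filename and some_long_command_line are the placeholders used above; the last line closes fd 3 once it is no longer needed):

if [[ some_condition ]]
then
    exec 3< "$filename"        # read input from the file
else
    exec 3<&0                  # duplicate the current stdin
fi
some_long_command_line <&3     # the long command line, written only once
exec 3<&-                      # close fd 3 when done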
Standard input can also be represented by the special device file /dev/stdin, so using that as a filename will work.
file="/dev/stdin" ./myscript < "$file"
(
    if [ ...some condition here... ]; then
        exec < "$fileName"    # redirect this subshell's stdin to the file
    fi
    exec ./myscript           # replace the subshell with the script
)
In a subshell, conditionally redirect stdin and exec the script. The redirection stays confined to the subshell, and the final exec replaces the subshell with the script, so no extra shell process is left running.
How about
function runfrom {
    local input="$1"
    shift
    case "$input" in
        -) "$@" ;;
        *) "$@" < "$input" ;;
    esac
}
I've used the minus sign to denote standard input because that's traditional for many Unix programs.
Now you write
[ ... condition ... ] && input="$fileName" || input="-"
runfrom "$input" my-complicated-command with many arguments
I find these functions/commands which take commands as arguments (like xargs(1)) can be very useful, and they compose well.
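For example (purely illustrative; nice(1) stands in here for any other wrapper that itself takes a command as its arguments), runfrom chains with such wrappers without extra plumbing:

runfrom "$input" nice -n 10 my-complicated-command with many arguments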
If you're careful, you can use 'eval' and your first idea.
[ ...some condition here... ] && input=$fileName || input="&0"
eval "./myScript <$input"
However, you say that 'myScript' is actually a complex command invocation; if it involves arguments which might contain spaces, then you must be very careful before deciding to use 'eval'.
Frankly, worrying about the cost of a 'cat' command is probably not worth the trouble; it is unlikely to be the bottleneck.
Even better is to design myScript so that it works like a regular Unix filter - it reads from standard input unless it is given one or more files to work on (like, say, cat or grep). That design is based on long and sound experience - and is therefore worth emulating to avoid having to deal with problems such as this.
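As a rough sketch of that filter convention (the tr call is just a stand-in for whatever myScript actually does): read the files named on the command line, or stdin when none are given.

#!/bin/bash
# Hypothetical filter: uppercase the input.
if [ "$#" -eq 0 ]; then
    tr 'a-z' 'A-Z'                  # no file arguments: read stdin
else
    cat -- "$@" | tr 'a-z' 'A-Z'    # otherwise read the named files
fi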
Use eval:
#! /bin/bash
[ $# -gt 0 ] && input="'"$1"'" || input="&1"
eval "./myScript <$input"
This simple stand-in for myScript
#! /usr/bin/perl -lp
$_ = reverse
produces the following output:
$ ./myDemux myScript
pl- lrep/nib/rsu/ !#
esrever = _$
$ ./myDemux
foo
oof
bar
rab
baz
zab
Note that it handles spaces in inputs too:
$ ./myDemux foo\ bar
eman eht ni ecaps a htiw elif
To pipe input down to myScript, use process substitution:
$ ./myDemux <(md5sum /etc/issue)
eussi/cte/ 01672098e5a1807213d5ba16e00a7ad0
Note that if you try to pipe the output directly, as in
$ md5sum /etc/issue | ./myDemux
it will hang waiting on input from the terminal, whereas ephemient's answer does not have this shortcoming.
A slight change produces the desired behavior:
#! /bin/bash
[ $# -gt 0 ] && input="'"$1"'" || input=/dev/stdin
eval "./myScript <$input"