 

bash: passing entire command (with arguments) to a function

Tags:

bash

scripting

I am essentially trying to implement a function which asserts the failure (non-zero exit code) of another command, and prints a message when it fails.

Here is my function:

function assert_fail () {
    COMMAND=$@
    if [ `$COMMAND; echo $?` -ne 0 ]; then 
        echo "$COMMAND failed as expected."
    else
        echo "$COMMAND didn't fail"
    fi
}

# This works as expected
assert_fail rm nonexistent

# This works too
assert_fail rm nonexistent nonexistent2

# This one doesn't work
assert_fail rm -f nonexistent

As soon as I add options to the command, it doesn't work. Here is the output of the above:

rm: cannot remove `nonexistent': No such file or directory
rm nonexistent failed as expected.
rm: cannot remove `nonexistent': No such file or directory
rm: cannot remove `nonexistent2': No such file or directory
rm nonexistent nonexistent2 failed as expected.
rm -f nonexistent didn't fail

I have tried putting double quotes around the commands, to no avail. I would expect the third invocation in the above to produce similar output to the other two.

I appreciate any/all help!

Asked Oct 15 '12 by Justin Lewis


People also ask

How do I pass an argument to a function in bash?

Bash Function Arguments To pass arguments to a function, add the parameters after the function call separated by spaces.
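As a minimal sketch (the function name greet is made up for illustration), arguments given after the function name become the positional parameters $1, $2, ... inside the function:

```shell
# Arguments listed after the function name become $1, $2, ... inside it.
greet() {
    echo "Hello, $1 and $2"
}

greet Alice Bob   # prints "Hello, Alice and Bob"
```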

What does [ -z $1 ] mean in bash?

$1 refers to the first input argument, and -z tests whether a string is undefined or empty. So [ -z $1 ] checks whether an input argument was supplied when running the script.
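A small sketch of that test, wrapped in a hypothetical check_arg function so it can be called with and without an argument:

```shell
# -z is true when the string is empty or unset.
check_arg() {
    if [ -z "$1" ]; then
        echo "no argument given"
    else
        echo "got: $1"
    fi
}

check_arg          # prints "no argument given"
check_arg hello    # prints "got: hello"
```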

How do you pass a command line argument to a function in shell script?

We can use the getopts builtin to parse the arguments passed to the script on the command line, using a loop and a case statement. With getopts, option arguments from the command line can be assigned to bash variables directly.
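A sketch of getopts in action (the option names -v and -o, and the parse_opts wrapper, are illustrative):

```shell
# Parse a -v flag and a -o <value> option with getopts.
parse_opts() {
    local verbose=0 outfile=""
    OPTIND=1                    # reset so the function can be called repeatedly
    while getopts "vo:" opt; do
        case "$opt" in
            v) verbose=1 ;;
            o) outfile=$OPTARG ;;
        esac
    done
    echo "verbose=$verbose outfile=$outfile"
}

parse_opts -v -o result.txt     # prints "verbose=1 outfile=result.txt"
```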

How do I pass multiple arguments in bash?

You can pass more than one argument to your bash script. In general, here is the syntax of passing multiple arguments to any bash script: script.sh arg1 arg2 arg3 … The first argument is referenced by the $1 variable, the second by $2, the third by $3, and so on.
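The same numbering applies inside a function; a small sketch (show_args is a made-up name) that also uses $# for the argument count:

```shell
# $# holds the number of arguments; $1, $2, $3 hold the individual values.
show_args() {
    echo "count=$# first=$1 second=$2 third=$3"
}

show_args red green blue   # prints "count=3 first=red second=green third=blue"
```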


2 Answers

@rici correctly pointed out the issue you're seeing, but there are a couple of real problems with your wrapper function.

First, it doesn't correctly preserve spaces (and some other funny characters) in arguments. COMMAND=$@ (or COMMAND="$@") merges all of the arguments into a single string, losing the distinction between spaces between arguments and spaces within arguments. To keep them straight, either use "$@" directly without storing it in a variable, or store it as an array (COMMAND=("$@")), then execute it as "${COMMAND[@]}".

Second, if the command prints anything to stdout, it'll wreak havoc with your exit status check; just test it directly, as @chepner said. Here's my suggested rewrite:

function assert_fail () {
    if "$@"; then 
        echo "$* didn't fail"
    else
        echo "$* failed as expected."
    fi
}

Note that the way I did the echo commands does lose the distinction of spaces within arguments. If that's a problem, replace the echo commands with this:

printf "%q " "$@"
echo "didn't fail"

and

printf "%q " "$@"
echo "failed as expected."
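Putting those pieces together, the full quoting-safe version of the function would look like this (assembled from the fragments above, not a separate implementation):

```shell
# Asserts that a command fails; printf %q reproduces each argument with
# shell quoting, so arguments containing spaces remain distinguishable.
assert_fail () {
    if "$@"; then
        printf "%q " "$@"
        echo "didn't fail"
    else
        printf "%q " "$@"
        echo "failed as expected."
    fi
}

assert_fail rm nonexistent      # reports "failed as expected."
assert_fail rm -f nonexistent   # rm -f exits 0, so reports "didn't fail"
```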
Answered Oct 16 '22 by Gordon Davisson


rm -f never fails on non-existent files. It has nothing to do with your wrapper. See man rm:

OPTIONS
       -f, --force
              ignore nonexistent files, never prompt
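You can confirm this directly at the shell by checking $? after each variant:

```shell
# Without -f, rm reports an error and exits non-zero for a missing file.
rm missing_file_demo 2>/dev/null
echo "without -f: $?"    # non-zero

# With -f, rm silently ignores the missing file and exits 0.
rm -f missing_file_demo
echo "with -f: $?"
```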
Answered Oct 16 '22 by rici