I want to execute a command with arguments read in from a file. This works fine until one of the arguments needs to contain a space. I have tried grouping the words with quotes and backslashes, but neither has worked.
The functionality I am after is exactly what xargs does, except I need to call a function rather than a command, because it relies on other variables set up elsewhere in the script.
script:
do_echo() {
    echo '$1:' $1
    echo '$2:' $2
}
line=$(cat input.txt)  # cat used for simplicity; the file can have more than one line
do_echo $line
input.txt:
"hello world" "hello back"
Expected result:
$1: hello world
$2: hello back
Observed result:
$1: "hello
$2: world"
EDIT:
I am using this to execute the same command multiple times with different inputs. There are up to 15 parameters per line, and there could be upwards of 50 lines.
A tabular format would be ideal, although the current answer of putting each parameter on its own line will work.
Unquoted variables (as in do_echo $line) are split at every character that appears in the IFS variable (by default tab, space, and newline). The splitting is truly strict: there is no way to quote or escape it from inside the variable's value.
The basic workaround is to use an otherwise unneeded character (for example a colon, :) as the splitting character.
For example:
$ cat input.txt
hello world:hello back
$ line=$(head -n 1 input.txt)
$ OLDIFS=$IFS IFS=:
$ do_echo $line
$1: hello world
$2: hello back
$ IFS=$OLDIFS
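If the input really has many lines (as mentioned in the edit), the same idea combines naturally with read. Below is a minimal sketch, assuming a tab-separated file called params.txt (a hypothetical name) with one parameter set per line:
#!/bin/bash

do_echo() {
    echo '$1:' "$1"
    echo '$2:' "$2"
}

# Set IFS to a tab for the read command only, split each line into an array,
# and pass the fields to the function as separate arguments.
while IFS=$'\t' read -r -a params; do
    do_echo "${params[@]}"
done < params.txt
Because IFS is changed only for the read command itself, the rest of the script keeps the default splitting behaviour everywhere else.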
Another workaround is using eval, but eval is dangerous: anything in the input, including command substitutions, is executed as shell code. You absolutely must trust the input!
$ cat input.txt
"hello world" "hello back"
$ line=$(head -n 1 input.txt)
$ eval do_echo "$line"
$1: hello world
$2: hello back
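To illustrate the risk with a contrived example (the input below is made up): any command substitution that appears in the input is executed when eval re-parses the line.
$ cat input.txt
"hello world" "$(date)"
$ line=$(head -n 1 input.txt)
$ eval do_echo "$line"    # $(date) runs here; a hostile command would run just as easily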