I have a Unix bash function that executes a script that parses custom environment variables. I want to avoid exporting the relevant variables in the shell, and instead set them only for the script, as part of the execution command.
If I set the variables directly in the command -- e.g., VARNAME=VARVAL script_name -- it works well. However, since I want to set multiple variables, based on different conditions, I want to use a local function variable to store the environment variable settings, and then use this variable in the script execution command.
I have a local "vars" variable that is ultimately set, e.g., to VARNAME=VARVAL, but if I try to run ${vars} script_name from my bash function, I get a "command not found" error for the $vars variable assignment -- i.e., the content of $vars is interpreted as a command instead of as an environment variable assignment.
I tried different variations of the command syntax, but so far to no avail. Currently I export the relevant variables in the function before calling the script, and then unset them or reset them to their previous values afterwards, but this is not really the solution I was hoping for.
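A minimal reproduction of what I am seeing (demo.sh is a stand-in for my real script):

```shell
#!/usr/bin/env bash
# Stand-in for the real script: just print the variable it expects.
cat > demo.sh <<'EOF'
#!/usr/bin/env bash
echo "VARNAME=$VARNAME"
EOF
chmod +x demo.sh

vars="VARNAME=VARVAL"

# Works: a literal assignment prefix is recognized before the command runs.
VARNAME=VARVAL ./demo.sh

# Fails: assignment prefixes are detected *before* expansion, so after
# expanding $vars, bash treats "VARNAME=VARVAL" as the command name.
${vars} ./demo.sh 2>&1 || true    # fails with "command not found"
```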
Any help would be greatly appreciated.
Thanks, Sharon
To have the contents of your variable evaluated as part of the command line instead of as a command name, you can try eval:
eval ${vars} script_name
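Because eval re-parses the whole line, the expanded text is recognized as an assignment prefix this time. A small sketch (demo.sh is a hypothetical stand-in for your script; note that eval will also re-interpret any quotes or metacharacters inside $vars, so treat its contents as trusted):

```shell
#!/usr/bin/env bash
# Stand-in script that prints the variable set for it.
cat > demo.sh <<'EOF'
#!/usr/bin/env bash
echo "$VARNAME"
EOF
chmod +x demo.sh

vars="VARNAME=VARVAL"

# eval concatenates its arguments and re-parses the result, so
# "VARNAME=VARVAL ./demo.sh" is seen as assignment prefix + command.
eval ${vars} ./demo.sh            # prints VARVAL
```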
However, since I want to set multiple variables, based on different conditions, I want to use a local function variable to store the environment variable settings, and then use this variable in the script execution command.
You don't need to store the variables in a separate variable. You can assign more than one variable for a command:
$ cat test.sh
#!/usr/bin/env bash
echo "$foo"
echo "$bar"
$ foo=abc bar=def ./test.sh
abc
def
This also has the advantage of being safer than eval.
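If the set of assignments really does depend on runtime conditions, one way to keep them out of eval is to collect them in a bash array and pass them to env, which sets each NAME=VALUE only for the child process. This is a sketch, not from the answer above; the condition and variable names are made up for illustration:

```shell
#!/usr/bin/env bash
# Stand-in script, as in the answer's example.
cat > test.sh <<'EOF'
#!/usr/bin/env bash
echo "$foo"
echo "$bar"
EOF
chmod +x test.sh

# Build the assignment list conditionally; "use_feature" is hypothetical.
assigns=()
use_feature=yes
if [ "$use_feature" = yes ]; then
  assigns+=(foo=abc)
fi
assigns+=(bar=def)

# env applies each NAME=VALUE to the child only -- no eval, and the
# quoted array expansion avoids word-splitting surprises.
env "${assigns[@]}" ./test.sh     # prints abc then def
```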