When writing bash scripts I like to write self-contained functions that take arguments and perform operations based on those arguments, rather than declaring global variables in several different places in the code, which decreases readability.
The issue arises when you have a function which needs to make use of several variables. Passing something like 10 separate arguments to a function is just plain ugly, and for that a simple associative array can be used instead, as sketched below.
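For example, one associative array can stand in for a whole set of named parameters. A minimal sketch (the names print_settings and cfg are illustrative, and declare -n needs bash 4.3 or newer):

print_settings () {
    declare -n cfg=$1    # name reference: cfg now aliases the caller's array
    local key
    for key in "${!cfg[@]}"; do
        echo "$key=${cfg[$key]}"
    done
}

declare -A settings=( [host]="example.com" [port]="8080" )
print_settings settings    # the array is passed by name, not by value

Passing the name avoids copying the array and works for arbitrarily many entries.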
If we want to declare those variables in an external file, the "source" command allows you to import them all in.
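For instance, with a hypothetical file vars.conf:

# contents of vars.conf
DB_HOST="localhost"
DB_PORT="5432"

# in the main script:
source ./vars.conf          # runs the file in the current shell
echo "$DB_HOST:$DB_PORT"    # prints localhost:5432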
The issue then becomes: how do I list the variables declared ONLY inside this file, so I can build my associative array from them? I have been able to use a combination of "compgen" and a loop to build associative arrays out of a list of variables (a sketch of that approach follows below), but how does one list only the variables found inside the file, regardless of what they are called, so I can loop through them and build my array?
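For reference, the compgen-based part can look something like this sketch: snapshot the variable list, source the file, and keep whatever is new (the name imported is illustrative; note this only catches variables that are actually set at sourcing time, so an assignment inside an uncalled function would be missed):

declare -A imported=()    # pre-declare our own names so they are already
before='' name=''         # in the snapshot and get skipped as "old"
before=$(compgen -v)
source /path/to/file
while read -r name; do
    imported[$name]=${!name}    # for arrays this only captures element 0
done < <(comm -13 <(sort <<<"$before") <(compgen -v | sort))

for name in "${!imported[@]}"; do
    echo "$name=${imported[$name]}"
done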
You could do an egrep for some variable-declaring syntax in your file and then get the variable name by cutting the matching lines, for example like this:
egrep '[a-zA-Z0-9"'\''\[\]]*=' /path/to/file | egrep -v '^#' | cut -d'=' -f1 | awk '{print $1}'
If you had a file with content like this:
#!/bin/bash
A="test"
somerandomfunction () {
echo "test function"
B="test"
}
#C="test"
DEF="test"
GHI1="test"
JKL[1]="test"
JKL['a']="test"
JKL["b"]="test"
The output of the above command would look like this:
A
B
DEF
GHI1
JKL[1]
JKL['a']
JKL["b"]
Explanation of the commands:

The first egrep looks for lines containing lowercase (a-z) and/or uppercase (A-Z) letters and/or digits (0-9) and/or square brackets (\[\]) and/or single ('\'') and/or double (") quotation marks followed by a =.

The second egrep excludes lines starting with a #, since those are usually interpreted as comments and would not generate or set a variable.

The cut splits each remaining line at the = and keeps the first field, i.e. it crops everything from the = to the line ending. The awk then prints only the first word of what is left, which strips any leading whitespace.

The output of the command could be used in a loop or something like this:
for VAR in $(egrep '[a-zA-Z0-9"'\''\[\]]*=' /path/to/file | egrep -v '^#' | cut -d'=' -f1 | awk '{print $1}'); do
    # the inner braces are required so that array entries like JKL[1]
    # expand as ${JKL[1]} rather than ${JKL} followed by a literal [1]
    eval echo "\${${VAR}}"
done
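Since the original goal was an associative array, the same list can also feed one directly; a sketch along those lines (the array name vars is arbitrary, and the file must have been sourced first so the values exist):

declare -A vars
for VAR in $(egrep '[a-zA-Z0-9"'\''\[\]]*=' /path/to/file | egrep -v '^#' | cut -d'=' -f1 | awk '{print $1}'); do
    eval "vars[\$VAR]=\${${VAR}}"    # key: the name as written; value: its current content
done

As with the echo loop, eval executes text taken from the file, so this (like sourcing the file in the first place) should only be done with files you trust.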