Tcsh (Tenex C Shell): this open-source shell for Linux is aimed at programmers because its syntax resembles the C language, so C programmers can use Tcsh's scripting features without any knowledge of Bash.
Ansible can not only set up your systems but also test them for correctness. By contrast, a shell script can be very dangerous to run more than once unless you're extremely careful and know exactly what every line does. For example, what will those gpg commands do if I run them again?
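To make that concrete with a hypothetical (and unrelated) sketch: appending a line to a profile works fine the first time, but blindly re-running it keeps adding duplicates, whereas an idempotent guard makes the script safe to run any number of times.

# naive: every run appends another copy of the line
echo 'export PATH=$PATH:/opt/tools/bin' >> ~/.profile

# idempotent: only append if the exact line is not already there
grep -qxF 'export PATH=$PATH:/opt/tools/bin' ~/.profile \
    || echo 'export PATH=$PATH:/opt/tools/bin' >> ~/.profile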
I have written quite complex shell scripts and my first suggestion is "don't". The reason is that it is fairly easy to make a small mistake that hinders your script, or even makes it dangerous.
That said, I have no resources to point you to other than my personal experience. Here is what I normally do, which is overkill, but tends to be solid, although very verbose.
Invocation
Make your script accept long and short options. Be careful, because there are two commands to parse options, getopt and getopts. Use getopt, as it gives you less trouble.
CommandLineOptions__config_file=""
CommandLineOptions__debug_level=""

getopt_results=`getopt -s bash -o c:d:: --long config_file:,debug_level:: -- "$@"`

if test $? != 0
then
    echo "unrecognized option"
    exit 1
fi

eval set -- "$getopt_results"

while true
do
    case "$1" in
        -c|--config_file)
            CommandLineOptions__config_file="$2";
            shift 2;
            ;;
        -d|--debug_level)
            CommandLineOptions__debug_level="$2";
            shift 2;
            ;;
        --)
            shift
            break
            ;;
        *)
            echo "$0: unparseable option $1"
            EXCEPTION=$Main__ParameterException
            EXCEPTION_MSG="unparseable option $1"
            exit 1
            ;;
    esac
done

if test "x$CommandLineOptions__config_file" == "x"
then
    echo "$0: missing config_file parameter"
    EXCEPTION=$Main__ParameterException
    EXCEPTION_MSG="missing config_file parameter"
    exit 1
fi
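With the snippet above the script can be invoked in either form (the script name and paths are made up for the example); note that getopt requires the value of an optional argument such as debug_level to be attached to the option:

./myscript.sh --config_file /etc/myapp.cfg --debug_level=2
./myscript.sh -c /etc/myapp.cfg -d2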
Another important point is that a program should always return zero if it completes successfully and non-zero if something went wrong.
Function calls
You can call functions in bash; just remember to define them before the call. Functions, like scripts, can only return numeric values. This means you have to invent a different strategy to return string values. My strategy is to use a variable called RESULT to store the result, and to return 0 if the function completed cleanly. You can also emulate raising exceptions: return a value different from zero and set two "exception variables" (mine are EXCEPTION and EXCEPTION_MSG), the first containing the exception type and the second a human-readable message.
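Here is a minimal sketch of that convention (the function and exception names are only illustrative): the function stores its string result in RESULT and returns 0, or returns 1 after filling the exception variables; the caller checks the return code instead of parsing output.

function Math__divide {
    # @description integer division, quotient goes into RESULT
    RESULT=""
    EXCEPTION=""
    EXCEPTION_MSG=""
    if test "$2" -eq 0
    then
        EXCEPTION="Math__DivisionByZeroException"
        EXCEPTION_MSG="cannot divide $1 by zero"
        return 1
    fi
    RESULT=$(($1 / $2))
    return 0
}

if Math__divide 10 2
then
    echo "quotient: $RESULT"
else
    echo "error: $EXCEPTION - $EXCEPTION_MSG"
fi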
When you call a function, its parameters are assigned to the special variables $1, $2, and so on. I suggest you copy them into variables with more meaningful names, and declare those variables inside the function as local:
function foo {
    local bar="$1"
}
Error prone situations
In bash, unless you declare otherwise, an unset variable expands to an empty string. This is very dangerous in case of a typo, because the mistyped variable will not be reported; it will simply be evaluated as empty. Use
set -o nounset
to prevent this from happening. Be careful, though: if you do this, the program will abort every time you evaluate an undefined variable. For this reason, the only way to check if a variable is not defined is the following:
if test "x${foo:-notset}" == "xnotset"
then
echo "foo not set"
fi
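For instance (greeting/greetng are hypothetical names), a typo that would otherwise silently expand to an empty string now aborts the script, unless you explicitly provide a default:

set -o nounset
greeting="hello"
# echo "$greetng"                # typo: with nounset the script aborts here
echo "${greetng:-no greeting}"   # an explicit default keeps nounset quiet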
You can declare variables as readonly:
readonly readonly_var="foo"
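A later assignment then fails at run time, which is a cheap way to protect constants from accidental overwrites (a tiny illustrative sketch):

readonly Config__retries="3"
Config__retries="5"   # bash reports "readonly variable" and refuses the assignment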
Modularization
You can achieve Python-like modularization if you use the following code:
set -o nounset
function getScriptAbsoluteDir {
    # @description used to get the script path
    # @param $1 the script $0 parameter
    local script_invoke_path="$1"
    local cwd=`pwd`

    # absolute path? if so, the first character is a /
    if test "x${script_invoke_path:0:1}" = 'x/'
    then
        RESULT=`dirname "$script_invoke_path"`
    else
        RESULT=`dirname "$cwd/$script_invoke_path"`
    fi
}
script_invoke_path="$0"
script_name=`basename "$0"`
getScriptAbsoluteDir "$script_invoke_path"
script_absolute_dir=$RESULT
function import() {
    # @description importer routine to get external functionality.
    # @description the first location searched is the script directory.
    # @description if not found, search the module in the paths contained in $SHELL_LIBRARY_PATH environment variable
    # @param $1 the .shinc file to import, without .shinc extension
    local module="${1:-}"

    if test "x$module" == "x"
    then
        echo "$script_name : Unable to import unspecified module. Dying."
        exit 1
    fi

    if test "x${script_absolute_dir:-notset}" == "xnotset"
    then
        echo "$script_name : Undefined script absolute dir. Did you remove getScriptAbsoluteDir? Dying."
        exit 1
    fi

    if test "x$script_absolute_dir" == "x"
    then
        echo "$script_name : empty script path. Dying."
        exit 1
    fi

    if test -e "$script_absolute_dir/$module.shinc"
    then
        # import from script directory
        . "$script_absolute_dir/$module.shinc"
        return
    elif test "x${SHELL_LIBRARY_PATH:-notset}" != "xnotset"
    then
        # import from the shell script library path
        # save the separator and use ':' instead
        local saved_IFS="$IFS"
        IFS=':'
        for path in $SHELL_LIBRARY_PATH
        do
            if test -e "$path/$module.shinc"
            then
                # restore the standard separator before sourcing
                IFS="$saved_IFS"
                . "$path/$module.shinc"
                return
            fi
        done
        # restore the standard separator
        IFS="$saved_IFS"
    fi

    echo "$script_name : Unable to find module $module."
    exit 1
}
You can then import files with the extension .shinc with the following syntax:
import "AModule/ModuleFile"
which will be searched for first in the script directory and then in $SHELL_LIBRARY_PATH. Since you always import into the global namespace, remember to prefix all your functions and variables with a proper prefix, otherwise you risk name clashes. I use a double underscore as the Python dot.
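In other words, where Python would write AModule.do_something, the shell version is a flat name with the module prefix baked in (hypothetical names):

# inside AModule/ModuleFile.shinc
AModule__SOME_CONSTANT="42"

function AModule__doSomething {
    RESULT="done"
    return 0
}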
Also, put this as the first thing in your module:
# avoid double inclusion
if test "${BashInclude__imported+defined}" == "defined"
then
return 0
fi
BashInclude__imported=1
Object oriented programming
In bash you cannot do object-oriented programming, unless you build a quite complex system of object allocation (I thought about it: it's feasible, but insane). In practice, you can however do "singleton-oriented programming": you have one instance of each object, and only one.
What I do is: I define an object in a module (see the modularization entry). Then I define empty vars (analogous to member variables), an init function (the constructor), and member functions, as in this example code:
# avoid double inclusion
if test "${Table__imported+defined}" == "defined"
then
return 0
fi
Table__imported=1
readonly Table__NoException=""
readonly Table__ParameterException="Table__ParameterException"
readonly Table__MySqlException="Table__MySqlException"
readonly Table__NotInitializedException="Table__NotInitializedException"
readonly Table__AlreadyInitializedException="Table__AlreadyInitializedException"
# an example for module enum constants, used in the mysql table, in this case
readonly Table__GENDER_MALE="GENDER_MALE"
readonly Table__GENDER_FEMALE="GENDER_FEMALE"
# private: prefixed with p_ (a convention; bash has no real private variables)
p_Table__mysql_exec="" # will contain the executed mysql command
p_Table__initialized=0
function Table__init {
    # @description init the module with the database parameters
    # @param $1 the mysql config file
    # @exception Table__NoException, Table__ParameterException
    EXCEPTION=""
    EXCEPTION_MSG=""
    EXCEPTION_FUNC=""
    RESULT=""

    if test $p_Table__initialized -ne 0
    then
        EXCEPTION=$Table__AlreadyInitializedException
        EXCEPTION_MSG="module already initialized"
        EXCEPTION_FUNC="$FUNCNAME"
        return 1
    fi

    local config_file="${1:-}"
    # yes, I am aware that I could put default parameters and other niceties, but I am lazy today
    if test "x$config_file" = "x"; then
        EXCEPTION=$Table__ParameterException
        EXCEPTION_MSG="missing parameter config file"
        EXCEPTION_FUNC="$FUNCNAME"
        return 1
    fi

    p_Table__mysql_exec="mysql --defaults-file=$config_file --silent --skip-column-names -e "

    # mark the module as initialized
    p_Table__initialized=1

    EXCEPTION=$Table__NoException
    EXCEPTION_MSG=""
    EXCEPTION_FUNC=""
    return 0
}
function Table__getName() {
    # @description gets the name of the person
    # @param $1 the row identifier
    # @result the name
    EXCEPTION=""
    EXCEPTION_MSG=""
    EXCEPTION_FUNC=""
    RESULT=""

    if test $p_Table__initialized -eq 0
    then
        EXCEPTION=$Table__NotInitializedException
        EXCEPTION_MSG="module not initialized"
        EXCEPTION_FUNC="$FUNCNAME"
        return 1
    fi

    local id="${1:-}"
    if test "x$id" = "x"; then
        EXCEPTION=$Table__ParameterException
        EXCEPTION_MSG="missing parameter identifier"
        EXCEPTION_FUNC="$FUNCNAME"
        return 1
    fi

    # declare first, assign second: "local name=`...`" would overwrite $? with local's own status
    local name
    name=`$p_Table__mysql_exec "SELECT name FROM table WHERE id = '$id'"`
    if test $? != 0 ; then
        EXCEPTION=$Table__MySqlException
        EXCEPTION_MSG="unable to perform select"
        EXCEPTION_FUNC="$FUNCNAME"
        return 1
    fi

    RESULT=$name
    EXCEPTION=$Table__NoException
    EXCEPTION_MSG=""
    EXCEPTION_FUNC=""
    return 0
}
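A caller would then use the module roughly like this (the module path, config file, and row id are hypothetical); note that success or failure is detected through the return code and the exception variables, never by parsing output:

import "mydb/Table"

Table__init "/etc/myapp/mysql.cnf"
if test $? -ne 0
then
    echo "$script_name : $EXCEPTION_FUNC failed: $EXCEPTION_MSG"
    exit 1
fi

Table__getName 42
if test $? -ne 0
then
    echo "$script_name : $EXCEPTION_FUNC failed: $EXCEPTION_MSG"
    exit 1
fi
echo "name: $RESULT"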
Trapping and handling signals
I found this useful to catch and handle signals.
function Main__interruptHandler() {
    # @description signal handler for SIGINT
    echo "SIGINT caught"
    exit
}

function Main__terminationHandler() {
    # @description signal handler for SIGTERM
    echo "SIGTERM caught"
    exit
}

function Main__exitHandler() {
    # @description signal handler for end of the program (clean or unclean).
    # probably redundant call, we already call the cleanup in main.
    exit
}

# catch signals and exit
trap Main__interruptHandler INT
trap Main__terminationHandler TERM
trap Main__exitHandler EXIT

function Main__main() {
    # body
}

Main__main "$@"
Hints and tips
If something does not work for some reason, try to reorder the code. Order is important and not always intuitive.
Do not even consider working with tcsh. It does not support functions, and it's horrible in general.
Hope it helps, although please note: if you have to use the kind of things I wrote here, it means that your problem is too complex to be solved with shell. Use another language. I had to use it due to human factors and legacy.
Take a look at the Advanced Bash-Scripting Guide for a lot of wisdom on shell scripting - not just Bash, either.
Don't listen to people telling you to look at other, arguably more complex languages. If shell scripting meets your needs, use that. You want functionality, not fanciness. New languages provide valuable new skills for your resume, but that doesn't help if you have work that needs to be done and you already know shell.
As stated, there aren't a lot of "best practices" or "design patterns" for shell scripting. Different uses have different guidelines and bias - like any other programming language.
Shell script is a language designed to manipulate files and processes. While it's great for that, it's not a general-purpose language, so always try to glue together logic from existing utilities rather than recreating new logic in shell script.
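For example, counting the ten most frequent words in a file is one pipeline of existing utilities (the file name is made up), whereas re-implementing the counting with shell loops and arrays would be slower and far more error prone:

tr -cs '[:alpha:]' '\n' < report.txt | sort | uniq -c | sort -rn | head -10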
Other than that general principle I've collected some common shell script mistakes.
Know when to use it. For quick-and-dirty gluing of commands together it's okay. If you need to make more than a few non-trivial decisions, loops, anything like that, go for Python or Perl, and modularize.
The biggest problem with shell is often that the end result just looks like a big ball of mud: 4000 lines of bash and growing... and you can't get rid of it because now your whole project depends on it. Of course, it started as 40 lines of beautiful bash.
There was a great session at OSCON this year (2008) on just this topic: http://assets.en.oreilly.com/1/event/12/Shell%20Scripting%20Craftsmanship%20Presentation%201.pdf
Use set -e so you don't plow forward after errors. Try making it sh-compatible without relying on bash if you want it to run on non-Linux systems.
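A typical defensive header along those lines might look like this (a sketch; it combines the nounset advice given earlier in its portable short form):

#!/bin/sh
set -e   # stop at the first failing command instead of plowing forward
set -u   # treat unset variables as errors (same as set -o nounset)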
Easy: use Python instead of shell scripts. You get a nearly 100-fold increase in readability without having to complicate anything you don't need, and you preserve the ability to evolve parts of your script into functions, objects, persistent objects (ZODB), and distributed objects (Pyro) with almost no extra code.