I need help with a bash script. The goal is:
- Run several commands in parallel
- Exit 1 if any command returns a non-zero exit status
For example, a run where the middle command fails:
$ ./parallel_commands "echo 1" "_echo 2" "echo 3" && echo "OK"
1
3
./parallel_commands: line 4: _echo: command not found
OK <- Incorrect
A run where all commands fail:
$ ./parallel_commands "_echo 1" "_echo 2" "_echo 3" && echo "OK"
./parallel_commands: line 4: _echo: command not found
./parallel_commands: line 4: _echo: command not found
./parallel_commands: line 4: _echo: command not found
-> The overall result is a failure -> Correct
Bash script:
#!/bin/bash

# Launch each argument as a background job and remember its PID
for cmd in "$@"; do {
  $cmd & pid=$!
  PID_LIST+=" $pid";
} done

# Kill the background jobs on Ctrl-C, then wait for all of them
trap "kill $PID_LIST" SIGINT
wait $PID_LIST
Thanks.
You are probably looking for something like this using GNU Parallel:
parallel ::: "echo 1" "_echo 2" "echo 3" && echo OK
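Note that parallel itself exits with a non-zero status if any of the jobs fail, which is what makes the && echo OK check behave correctly here. If you also want the remaining jobs stopped as soon as the first one fails, the --halt option covers that; treat the line below as a sketch and check the man page of your installed version for the exact syntax:

parallel --halt now,fail=1 ::: "echo 1" "_echo 2" "echo 3" && echo OK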
GNU Parallel is a general parallelizer and makes it easy to run jobs in parallel on the same machine or on multiple machines you have ssh access to.
If you have 32 different jobs you want to run on 4 CPUs, a straightforward way to parallelize is to run 8 jobs on each CPU. GNU Parallel instead spawns a new job as soon as one finishes, keeping all CPUs active and thus saving wall-clock time.
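As a hypothetical illustration, the following runs 32 short jobs but never more than 4 at a time; parallel starts the fifth job the moment one of the first four finishes:

parallel -j 4 'sleep 1; echo job {} done' ::: $(seq 32)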
Installation
If GNU Parallel is not packaged for your distribution, you can do a personal installation, which does not require root access. It can be done in 10 seconds by doing this:
(wget -O - pi.dk/3 || curl pi.dk/3/ || fetch -o - http://pi.dk/3) | bash
For other installation options see http://git.savannah.gnu.org/cgit/parallel.git/tree/README
Learn more
See more examples: http://www.gnu.org/software/parallel/man.html
Watch the intro videos: https://www.youtube.com/playlist?list=PL284C9FF2488BC6D1
Walk through the tutorial: http://www.gnu.org/software/parallel/parallel_tutorial.html
Sign up for the email list to get support: https://lists.gnu.org/mailman/listinfo/parallel
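If you would rather not depend on GNU Parallel, here is a plain-bash sketch of the same idea, assuming it is enough to wait on each PID individually and remember whether any job failed:

#!/bin/bash

# Start every command in the background and remember its PID.
pids=()
for cmd in "$@"; do
    $cmd &
    pids+=($!)
done

# Forward Ctrl-C to the background jobs.
trap 'kill "${pids[@]}" 2>/dev/null' SIGINT

# Wait for each job; remember if any of them failed.
status=0
for pid in "${pids[@]}"; do
    wait "$pid" || status=1
done

exit $status

With this, the first example above exits 1 because _echo 2 fails, so && echo "OK" no longer prints OK.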