I need a bash script to run some jobs in the background, three jobs at a time.
I know I can do this in the following way, and for illustration, I will assume the number of jobs is 6:
./j1 &
./j2 &
./j3 &
wait
./j4 &
./j5 &
./j6 &
wait
However, this way, if, for example, j2 takes a lot longer to run than j1 and j3, then I will be stuck with only one background job running for a long time.
The alternative (which is what I want) is that whenever one job completes, bash should start the next job in the queue, so that three jobs are kept running at any given time. Is it possible to write a bash script to implement this alternative, possibly using a loop? Please note that I actually need to run far more jobs, and I expect this alternative method to save me a lot of time.
Here is my draft of the script. I hope you can help me verify its correctness and improve it, as I'm new to scripting in bash. The ideas in this script are taken and modified from here, here, and here:
for i in $(seq 6)
do
    # wait here if the number of jobs is 3 (or more)
    while (( $(jobs -p | wc -l) >= 3 ))
    do
        sleep 5   # check again after 5 seconds
    done
    ./j$i &
done
wait
I think this script implements the required behavior. However, I would like to know from bash experts whether I'm doing something wrong, or whether there is a better way of implementing this idea.
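A quick way to exercise a loop like this without the real j1 … j6 scripts is to substitute short sleeps for the jobs (the durations below are arbitrary placeholders, and `jobs -rp` is used so only still-running jobs are counted):

```shell
# Dummy-job version of the draft: each sleep stands in for ./j$i.
LOG=$(mktemp)
for i in $(seq 6)
do
    # poll until fewer than 3 background jobs are still running
    while (( $(jobs -rp | wc -l) >= 3 ))
    do
        sleep 0.1   # check again shortly
    done
    { sleep 0.3; echo "job $i finished" >>"$LOG"; } &
done
wait
cat "$LOG"
```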
Thank you very much.
With GNU xargs:
printf '%s\0' j{1..6} | xargs -0 -n1 -P3 sh -c './"$1"' _
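Since j1 … j6 are the asker's own executables, a self-contained way to see the effect is to feed sleeps of varying (arbitrary) lengths through the same xargs invocation; xargs keeps three running and starts the next as soon as any one finishes:

```shell
# Six dummy jobs of varying length, run at most three at a time.
out=$(printf '%s\0' 3 1 2 1 3 2 |
      xargs -0 -n1 -P3 sh -c 'sleep "0.$1"; echo "slept 0.$1 s"' _)
echo "$out"
```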
With bash (4.3+) builtins:
max_jobs=3; cur_jobs=0
for ((i=1; i<=6; i++)); do
  # If we're at the limit, wait for the next background job to finish.
  ((cur_jobs >= max_jobs)) && wait -n
  # Start the next job and increment the count of running jobs.
  ./j"$i" & ((++cur_jobs))
done
wait
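The same loop can be tried end-to-end by substituting a short sleep for ./j$i (the sleep durations are placeholders; `wait -n` needs bash 4.3 or newer):

```shell
# wait -n throttle with dummy jobs: at most three sleeps run at once.
max_jobs=3; cur_jobs=0
RESULTS=$(mktemp)
for ((i=1; i<=6; i++)); do
  if ((cur_jobs >= max_jobs)); then
    wait -n   # block until one of the running jobs exits
  fi
  { sleep 0.2; echo "job $i done" >>"$RESULTS"; } &   # stand-in for ./j$i
  ((++cur_jobs))
done
wait   # let the remaining jobs drain
cat "$RESULTS"
```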
Note that the approach relying on builtins has some corner cases -- if you have multiple jobs exiting at the exact same time, a single wait -n
can reap several of them, thus effectively consuming multiple slots. If we wanted to be more robust, we might end up with something like the following:
max_jobs=3
declare -A cur_jobs=( ) # build an associative array w/ PIDs of jobs we started
for ((i=1; i<=6; i++)); do
  if (( ${#cur_jobs[@]} >= max_jobs )); then
    wait -n # wait for at least one job to exit
    # ...and then remove any jobs that are no longer running from the table
    for pid in "${!cur_jobs[@]}"; do
      kill -0 "$pid" 2>/dev/null || unset "cur_jobs[$pid]"
    done
  fi
  ./j"$i" & cur_jobs[$!]=1
done
wait
...which is obviously a lot of work, and still has a minor race. Consider using xargs -P instead. :)
Using GNU Parallel:
parallel -j3 ::: ./j{1..6}
Or if your shell does not do {1..6} brace expansion (e.g. csh):
seq 6 | parallel -j3 ./j'{}'
If you think you cannot install GNU Parallel, please read http://oletange.blogspot.dk/2013/04/why-not-install-gnu-parallel.html and leave a comment on why you cannot install it.