How can I throttle the number of commands run inside a bash script?

Tags:

bash

Let's assume I have a bash script that executes code like so:

for i in $LIST; do
    /path/to/my/script.sh "$i" &
done

As you can see, I am pushing these scripts into the background, allowing the parent script to launch as many commands as it can, as fast as it can. The problem is that my system eventually runs out of memory, because each instance takes about 15 to 20 seconds to run.

I'm running one static script.sh file and passing a simple variable (e.g. a customer number) into the script. There are about 20,000 to 40,000 records that I am looping through at any given time.

My question is: how can I tell the system to have only X instances of script.sh running at once? If too many are running, I want to pause the script until the number of running instances drops below the threshold, and then continue.
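For illustration, here is a minimal pure-bash sketch of the throttling I'm after (it assumes bash 4.3+ for wait -n, and a hypothetical limit of 4):

MAX_JOBS=4   # hypothetical threshold
for i in $LIST; do
    # while we are at the limit, block until any one background job exits
    while (( $(jobs -rp | wc -l) >= MAX_JOBS )); do
        wait -n   # bash 4.3+
    done
    /path/to/my/script.sh "$i" &
done
wait   # wait for the remaining jobs to finish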

Any ideas?

asked Jan 04 '12 by Slickrick12

1 Answer

Two tools can do this:

(Note: I have changed your input selection because you should be prepared to handle strange filenames, e.g. ones containing spaces.)

GNU xargs

find -iname '*.txt' -print0 | xargs -0 -r -n1 -P4 /path/to/my/script.sh

Runs up to 4 jobs in parallel (-P4), passing one argument per invocation (-n1); -r skips running the command entirely if the input is empty.
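Since the question feeds customer numbers rather than filenames, the same idea can be adapted; a sketch, assuming $LIST holds whitespace-separated values with no embedded newlines or NUL bytes:

printf '%s\0' $LIST | xargs -0 -r -n1 -P4 /path/to/my/script.sh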

xjobs

find -iname '*.txt' -print0 | xjobs -0 /path/to/my/script.sh

Runs as many jobs in parallel as you have processors. xjobs does a better job than xargs of separating the output of the various jobs.

Add -j4 to run at most 4 jobs in parallel.
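A sketch based on the command above (assuming a typical xjobs build that accepts the option before the utility name):

find -iname '*.txt' -print0 | xjobs -j4 -0 /path/to/my/script.sh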

answered by sehe