I have a file containing command lines that I want to run. This file contains around 2,000 lines.
I have 8 cores available. Is it possible to parse the file and start 8 processes, then execute another one from the file whenever one of the programs finishes? I want this to continue until the end of file is reached.
Running Commands in Parallel Using the Bash Shell

A simple method is to put all the wget commands in one script and execute the script, with each command sent to the background (a shell background job). See the simple script below; notice the & at the end of each command.
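A minimal sketch of such a script (the URLs are placeholders, not from the original article):

#!/bin/bash
# Each wget is sent to the background with &, so all three run at once.
wget https://example.com/file1.iso &
wget https://example.com/file2.iso &
wget https://example.com/file3.iso &
# wait blocks until every background job has finished.
wait
echo "all downloads finished"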
Method #1: Using the Semicolon Operator

Separating a chain of commands with semicolons is the most common way to run multiple commands on one line in a terminal. Part of the reason is how the operator behaves: it runs every command in the sequence regardless of whether the previous command succeeded or failed. Note, however, that the commands run one after another, not in parallel; for concurrent execution you need the & background operator shown above, or a tool such as GNU parallel, described next. You can chain as many commands as you want, separated by semicolons.
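A short illustration of the difference (the commands are arbitrary examples, not from the original post):

# Sequential: the second sleep starts only after the first finishes (~4s total).
sleep 2; echo "first"; sleep 2; echo "second"

# Parallel: both sleeps run at once (~2s total); wait blocks until both are done.
sleep 2 &
sleep 2 &
wait
echo "both finished"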
Use GNU parallel. It's an incredibly powerful tool, and official packages exist for around 20 Linux distributions. What's that? You have an excuse as to why you can't use it? Here's a simple example showing how to run a list or file of commands in parallel:
Contents of jobs.txt:
sleep 1; echo "a"
sleep 3; echo "b"
sleep 2; echo "c"
Command:
time parallel :::: jobs.txt
Results:
a
c
b
real 0m3.332s
user 0m0.170s
sys 0m0.037s
Notes:
If you wish to keep the output in the same order as the input, pass the -k flag to GNU parallel.
If you have more than eight cores and only wish to use eight of them, add -j 8 to the argument list (see the sketch after these notes).
The man page is a good read, but if you haven't already been through the GNU Parallel tutorial, I would highly recommend the time investment.
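Applied to the original question, a minimal sketch (assuming the 2,000 command lines live in a file named commands.txt, one command per line; the filename is an assumption, substitute your own):

# Run at most 8 jobs at a time; as each command finishes,
# parallel starts the next line from the file until end of file.
parallel -j 8 :::: commands.txt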