 

Have bash script execute multiple programs as separate processes

As the title suggests, how do I write a bash script that will execute, for example, 3 different Python programs as separate processes? And then am I able to gain access to each of these processes to see what is being logged to the terminal?

Edit: Thanks again. I forgot to mention that I'm aware of appending &, but I'm not sure how to access what is being output to the terminal for each process. For example, I could run all 3 of these programs separately in different tabs and be able to see what each one outputs.

asked Aug 12 '13 by Petesta

People also ask

How do I run parallel jobs in shell script?

Examples. To download all files in parallel using wget:

#!/bin/bash
# Our custom function
cust_func(){
    wget -q "$1"
}

while IFS= read -r url
do
    cust_func "$url" &
done < list.txt

wait
echo "All files are downloaded."

How do I run multiple commands in bash script?

On Linux, there are three ways to run multiple commands in a terminal: the semicolon (;) operator, the logical OR (||) operator, and the logical AND (&&) operator.
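A minimal illustration of those three operators (the commands are just placeholders):

mkdir -p /tmp/demo ; ls /tmp/demo              # ; runs the second command regardless of the first's result
grep -q root /etc/passwd && echo "found"       # && runs the second command only if the first succeeds
grep -q nosuchuser /etc/passwd || echo "none"  # || runs the second command only if the first fails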

How do I run two Bash commands in parallel?

The best method is to put all the wget commands in one script, and execute the script. The only thing to note here is to put all these wget commands in the background (shell background). See the simple sketch below, and notice the & towards the end of each command.
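A minimal sketch of that pattern, with placeholder URLs:

#!/bin/bash
# start three downloads at once; the trailing & sends each wget to the background
wget -q http://example.com/file1 &
wget -q http://example.com/file2 &
wget -q http://example.com/file3 &
wait  # block until all background downloads have finished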

How do I run Parallels Bash?

To run scripts in parallel in bash, you must send the individual scripts to the background. That way the loop does not wait for each process to exit and immediately launches all the scripts.
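A minimal sketch of that loop pattern (the script names are hypothetical):

#!/bin/bash
for script in job1.sh job2.sh job3.sh; do
    ./"$script" &   # send each script to the background immediately
done
wait                # only now block until every background job has exited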


2 Answers

You can run a job in the background like this:

command &

This allows you to start multiple jobs in a row without having to wait for the previous one to finish.
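For the three Python programs from the question, a minimal sketch (the script names are placeholders) would be:

#!/bin/bash
python prog1.py &
python prog2.py &
python prog3.py &
wait  # optional: block until all three have finished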

If you start multiple background jobs like this, they will all share the same stdout (and stderr), which means their output is likely to get interleaved. For example, take the following script:

#!/bin/bash
# countup.sh

for i in `seq 3`; do
    echo $i
    sleep 1
done

Start it twice in the background:

./countup.sh &
./countup.sh &

And what you see in your terminal will look something like this:

1
1
2
2
3
3

But it could also look like this:

1
2
1
3
2
3

You probably don't want this, because it would be very hard to figure out which output belongs to which job. The solution? Redirect stdout (and optionally stderr) for each job to a separate file. For example,

command > file &

will redirect only stdout and

command > file 2>&1 &

will redirect both stdout and stderr for command to file while running command in the background. This page has a good introduction to redirection in Bash. You can view the command's output "live" by tailing the file:

tail -f file

I would recommend running background jobs with nohup or screen, as user2676075 mentioned, to let your jobs keep running after you close your terminal session, e.g.

nohup command1 > file1 2>&1 &
nohup command2 > file2 2>&1 &
nohup command3 > file3 2>&1 &
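Assuming the log file names above, you can then follow all three outputs live in one terminal:

tail -f file1 file2 file3

When given multiple files, tail prefixes each chunk of output with a ==> file <== header, so you can tell which job produced what.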
answered by ThisSuitIsBlackNot


Try something like:

command1 2>&1 | tee commandlogs/command1.log &
command2 2>&1 | tee commandlogs/command2.log &
command3 2>&1 | tee commandlogs/command3.log &
...

Then you can tail the files as the commands run. Remember, you can follow them all by being in the directory and doing a "tail -f *.log"

Alternatively, you can setup a script to generate a screen for each command with:

screen -S CMD1 -d -m command1 ;
screen -S CMD2 -d -m command2 ;
screen -S CMD3 -d -m command3
...

Then list them later with screen -ls and reconnect with screen -r [screen name]

Enjoy

answered by Alex Atkinson