My script uses while read to process a file line by line. When I do:
head -n5 file1 | ./myscript.sh
I get the results I expect.
But when I try to parallelize it with GNU parallel:
head -n5 file1 | parallel -j 4 ./myscript.sh
the result file comes out empty!?
I also tried:
parallel -j 4 -a file1 ./myscript.sh
but that still doesn't work. I was trying to do something similar to what the documentation describes, but without any success. What am I doing wrong?
EDIT:
Maybe this can help:
head -n5 file1 | parallel -a - -j 4 echo #this works
head -n5 file1 | parallel -a - -j 4 ./myscript #this doesn't
parallel doesn't send the lines of input to the stdin of the command you give it; it appends each line to the command. Written the way you have it, you're effectively calling ./myscript.sh <INPUT>, when what you want is to call ./myscript.sh and send the input on stdin.
This should work:
head -n5 file1 | parallel -j 4 "echo {} | ./myscript.sh"
The {} tells parallel where to substitute the input, rather than the default of appending it at the end of the command.
--pipe is made for this:
cat file1 | parallel --pipe -N5 ./myscript.sh
But you need to change myscript.sh so it does not save to result but instead prints its output to stdout. Then you can do:
cat file1 | parallel --pipe -N5 ./myscript.sh > result
and avoid any mixing of the output.
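For reference, a stdin-to-stdout version of the script could look roughly like this; the tr call is only a stand-in for whatever transformation myscript.sh actually performs, and the key point is that nothing inside the script writes to the result file:

#!/usr/bin/env bash
# Minimal stdin-to-stdout sketch: read each line, do the work, and print
# the output instead of appending it to a file. The caller redirects
# stdout (e.g. "> result") as shown above.
while IFS= read -r line; do
    printf '%s\n' "$line" | tr 'a-z' 'A-Z'
done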