I currently have a shell script which relies on a curl command like this:
curl --request POST -u name:pass -H "Content-Type: application/json" \
  --data "{data}" https://url.com --cacert ./my_crt
I don't need the response of the command, and this command runs inside a big for loop, so waiting for the responses takes a lot of time.
So, is there a way in bash to do exactly the same thing, but without waiting for the response?
If you have a large number of requests you want to issue quickly and you don't care about the output, there are two things you should do: reuse connections and parallelize.
Reuse connections: for small requests, it's generally much faster to do 10 requests each on 1 connection than 1 request each on 10 connections. For Henry's HTTP post test server, the difference is 2.5x:
$ time for i in {1..10}; do
    curl -F foo=bar https://posttestserver.com/post.php;
  done
Successfully dumped 1 post variables.
View it at http://www.posttestserver.com/data/2016/06/09/11.44.48536583865
Post body was 0 chars long.
(...)
real 0m2.429s
vs
$ time {
    array=();
    for i in {1..10}; do
      array+=(--next -F foo=bar https://posttestserver.com/post.php);
    done;
    curl "${array[@]}";
  }
Successfully dumped 1 post variables.
View it at http://www.posttestserver.com/data/2016/06/09/11.45.461371907842
(...)
real 0m1.079s
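Applied to the command from your question (keeping its placeholder name:pass, {data}, https://url.com, and ./my_crt values), the same connection-reuse trick would look roughly like this, repeating the full option set after each --next:

args=()
for i in {1..10}; do
  # Newlines are fine inside the array parentheses; each --next group
  # describes one complete request on the shared connection
  args+=(--next --request POST -u name:pass
         -H "Content-Type: application/json"
         --data "{data}" https://url.com --cacert ./my_crt)
done
curl "${args[@]}"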
Parallelize: here, sem from GNU parallel limits the number of parallel connections to 4. This is a better version of backgrounding and waiting, since it always keeps the pipeline at full capacity: as soon as one request finishes, the next one starts.
for i in {1..20}; do
  sem -j 4 curl -F foo=bar https://posttestserver.com/post.php
done
sem --wait
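For comparison, the plain background-and-wait pattern it improves on looks like this: all 20 requests start at once with no cap on concurrent connections, and wait only returns once every background curl has exited.

for i in {1..20}; do
  curl -F foo=bar https://posttestserver.com/post.php &
done
wait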
The number of parallel requests you want depends on how beefy the host is; a realistic number could be 32+.
Combine the two strategies, and you should see a hefty speedup without DoS'ing yourself.
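One way to combine them, sketched here against the test server from above (the batch size of 25 and the job cap of 4 are arbitrary picks, not tuned values): give each curl process a batch of requests to send over one connection, and let sem cap how many of those processes run at once.

batch=()
for i in {1..100}; do
  batch+=(--next -F foo=bar https://posttestserver.com/post.php)
  if (( i % 25 == 0 )); then
    # One curl process reuses a single connection for 25 requests;
    # sem allows at most 4 such processes at a time
    sem -j 4 curl "${batch[@]}"
    batch=()
  fi
done
# 100 divides evenly by 25, so there is no partial batch left to flush
sem --wait   # block until every batch has completed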