The solution to this is to use the xargs command alongside the curl command. The -P flag sets the number of requests that run in parallel. The process substitution <(printf '%s\n' {1..10}) prints the numbers 1 to 10, which causes the curl command to run 10 times with 5 requests running in parallel.
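A minimal sketch of that command (assuming the same local example endpoint used in the answers below):
xargs -I % -P 5 curl "http://localhost:5000/example" < <(printf '%s\n' {1..10})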
Using the xargs -P option, you can run any command in parallel:
xargs -I % -P 8 curl -X POST "http://localhost:5000/example" \
< <(printf '%s\n' {1..400})
This will run the curl command 400 times with a maximum of 8 jobs in parallel.
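If you actually want each run's number in the request, you can reference the % placeholder in the command; here it is appended as a hypothetical query parameter:
xargs -I % -P 8 curl -X POST "http://localhost:5000/example?run=%" \
< <(printf '%s\n' {1..400})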
You can use xargs with the -P option to run any command in parallel:
seq 1 200 | xargs -n1 -P10 curl "http://localhost:5000/example"
This will run the curl command 200 times with a maximum of 10 jobs in parallel.
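To measure how long the whole batch takes, you can wrap the same pipeline in time:
time (seq 1 200 | xargs -n1 -P10 curl "http://localhost:5000/example")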
This is an addition to @saeed's answer.
I faced an issue where it made unnecessary requests to the hosts 0.0.0.1, 0.0.0.2, ..., 0.0.0.N.
The reason was that xargs was passing the numbers from seq as extra arguments to the curl command, and curl interprets a bare number such as 1 as the host 0.0.0.1. To prevent the arguments from being passed, we can specify which character the argument should replace by using the -I flag.
So we will use it as:
... xargs -I '$' command ...
Now xargs will replace the argument wherever the $ literal is found, and if it is not found, the argument is not passed at all. Using this, the final command will be:
seq 1 200 | xargs -I $ -n1 -P10 curl "http://localhost:5000/example"
Note: if you are using $ in your command, try replacing it with some other character that is not being used.
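For example, reusing % as the replacement character (as in an earlier answer) has the same effect:
seq 1 200 | xargs -I % -n1 -P10 curl "http://localhost:5000/example"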
Background them with &, and add wait at the end:
for ((request=1;request<=20;request++))
do
  for ((x=1;x<=20;x++))
  do
    time curl -X POST "http://localhost:5000/example" &
  done
done
wait
They will all output to the same stdout, but you can redirect the result of the time (and stdout and stderr) to a named file:
{ time curl -X POST "http://localhost:5000/example" ; } > output.${x}.${request}.out 2>&1 &
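Afterwards you can collect the timing lines from all of the generated files, for example:
grep real output.*.out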
Adding to @saeed's answer, I created a generic function that uses its arguments to fire a command a total of N times, with at most M jobs in parallel:
function conc(){
  # {} never appears in the command, so the numbers from seq are not
  # appended to it (see the -I explanation in the answer above)
  local cmd=("${@:3}")
  seq 1 "$1" | xargs -I {} -n1 -P"$2" "${cmd[@]}"
}
$ conc N M cmd
$ conc 10 2 curl --location --request GET 'http://google.com/'
This will fire 10 curl commands with a maximum parallelism of 2.
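As another sketch, this would fire 100 requests with 10 in parallel against the earlier example endpoint, printing only the HTTP status codes:
conc 100 10 curl -s -o /dev/null -w "%{http_code}\n" "http://localhost:5000/example"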
Adding this function to your ~/.bash_profile makes it easier to use.
Update 2020:
Curl can now fetch several websites in parallel:
curl --parallel --parallel-immediate --parallel-max 3 --config websites.txt
websites.txt file:
url = "website1.com"
url = "website2.com"
url = "website3.com"
I wanted to share an example of how I used xargs to parallelise curl. The advantage of using xargs is that you can specify how many processes will be used to parallelise the curl calls, rather than backgrounding them with &, which would schedule all of, say, 10000 curls simultaneously. Hope it will be helpful to somebody:
#!/bin/sh
url=/any-url
currentDate=$(date +%Y-%m-%d)
payload='{"field1":"value1", "field2":{},"timestamp":"'$currentDate'"}'
threadCount=10
cat "$1" | \
xargs -P "$threadCount" -I {} curl -sw 'url = %{url_effective}, http_status_code = %{http_code}, time_total = %{time_total} seconds \n' \
  -H "Content-Type: application/json" -H "Accept: application/json" \
  -X POST "$url" --max-time 60 -d "$payload"
The .csv file passed as the first argument has one value per row; xargs substitutes each row for the {} placeholder in the JSON payload (the {} in "field2":{}).
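For example (values.csv and the script name are hypothetical; each row should be a valid JSON value, since it is substituted into "field2"):
printf '%s\n' 1 2 3 > values.csv
sh post_parallel.sh values.csv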