When I try to perform a load test using httperf with a high request rate, I get the following error:
» httperf --client=0/1 --server=www.xxxxx.com --port=80 --uri=/ --send-buffer=4096 --recv-buffer=16384 --num-conns=200 --rate=30
httperf --client=0/1 --server=staging.truecar.com --port=80 --uri=/ --rate=30 --send-buffer=4096 --recv-buffer=16384 --num-conns=200 --num-calls=1
httperf: warning: open file limit > FD_SETSIZE; limiting max. # of open files to FD_SETSIZE
Segmentation fault: 11
The error is raised whenever the rate is greater than 15.
Versions:
httperf 0.9.0
OS X 10.7.1
As the warning states, the number of connections to the HTTP server is exceeding the maximum number of allowed open file descriptors. It's likely that even though httperf is capping the value at FD_SETSIZE, you're still reaching beyond that limit.
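If you're curious what FD_SETSIZE actually is on your machine (on OS X it is typically 1024), you can ask the C preprocessor to expand the macro. A quick sketch, assuming a working compiler toolchain is installed; the 1024 shown is the typical Darwin value and may differ on your system:

$ echo FD_SETSIZE | cc -E -x c -include sys/select.h - | tail -n 1
1024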
You can check the current limits with ulimit -a:
$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 256
pipe size            (512 bytes, -p) 1
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 709
virtual memory          (kbytes, -v) unlimited
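The relevant entry is open files (-n) 256. You can also query just that one value:

$ ulimit -n
256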
Try increasing the limit with ulimit -n <n>:
$ ulimit -n 2048
$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
file size               (blocks, -f) unlimited
max locked memory       (kbytes, -l) unlimited
max memory size         (kbytes, -m) unlimited
open files                      (-n) 2048
pipe size            (512 bytes, -p) 1
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 709
virtual memory          (kbytes, -v) unlimited
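Keep in mind that ulimit only affects the current shell session and the processes started from it, so raise the limit in the same shell you run httperf from. For example, reusing the command from the question:

$ ulimit -n 2048
$ httperf --client=0/1 --server=www.xxxxx.com --port=80 --uri=/ --send-buffer=4096 --recv-buffer=16384 --num-conns=200 --rate=30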
Raising the open-files limit is common practice on large web servers and the like, since a socket is essentially just an open file descriptor.
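You can see this for yourself with lsof, which lists sockets right alongside regular files among a process's open descriptors. A sketch to run while a test is going in another terminal (-i restricts the output to Internet sockets, -a ANDs that with -c, which selects processes by command name):

$ lsof -a -i -c httperf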