I'm trying to build a small stress-test script to measure how quickly a set of requests completes.
I need to measure the total time for 100 requests.
The problem is that I don't know how to implement it, since it requires the URL requests to be made in parallel. Any ideas?
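If you just want a quick number without pulling in a framework, the standard library is enough. Here is a minimal sketch using `concurrent.futures.ThreadPoolExecutor` to fire the requests in parallel and time the whole batch; the URL, the `run_stress_test` helper, and the request/concurrency counts are placeholders you would adapt:

```python
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

# Placeholder target -- replace with the endpoint you want to stress.
URL = "http://example.com/"

def fetch(url):
    # Issue a single request and return its HTTP status code.
    with urlopen(url, timeout=10) as resp:
        return resp.status

def run_stress_test(worker, n=100, concurrency=10):
    # Submit n calls to worker through a thread pool and time the batch.
    start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        results = list(pool.map(worker, [URL] * n))
    elapsed = time.perf_counter() - start
    return results, elapsed

# Usage (hits the network):
#   results, elapsed = run_stress_test(fetch)
#   print(f"{len(results)} requests in {elapsed:.2f}s")
```

Threads work fine here because the workload is I/O-bound; for very high concurrency you'd want an async or greenlet-based approach instead.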
Also, there is Locust, an excellent open-source, pure-Python, distributed and scalable load-testing framework that uses greenlets. It is great at simulating an enormous number of simultaneous users.
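For reference, a Locust test is just a small Python file; this is a minimal sketch (the user class name and the `/` path are placeholders) that you would run with the `locust` command rather than directly:

```python
from locust import HttpUser, task, between

class WebsiteUser(HttpUser):
    # Each simulated user waits 1-5 seconds between tasks.
    wait_time = between(1, 5)

    @task
    def index(self):
        # Request the root path of the host given on the locust command line.
        self.client.get("/")
```

You then point Locust at your target host and choose the user count and spawn rate from its web UI or CLI.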
Why build it yourself? There are several tools already available.
If you want to stay within the Python ecosystem, you might try Corey Goldberg's Pylot or its successor, multi-mechanize.
Although this question is quite old, I'd like to point out a handy tool for this problem: FunkLoad.