I developed a web server that serves only static files (HTML, images, etc.). Now I want to compare it with other web servers under different workloads. My design aims to make better use of file caching and, hopefully, to improve performance on access patterns that follow similar routines/workloads.
Are there any existing large static website samples that are particularly good for testing different access behaviours and workloads? And are there any good workload generators for this purpose?
For example, to simulate typical load behaviour:
Load Page1.html -> Load Page2.html -> Download a random file from the list in Page2.html -> Exit
I believe JMeter may be useful for this, but I couldn't find any ready-made static web page files or workload scripts. Are there any other existing tools or frameworks you can suggest?
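In case it helps clarify what I mean, here is a minimal sketch of that workflow in Python. The base URL, page names, and think time are placeholders for whatever site and server you test against; link extraction uses a simple regex rather than a full HTML parser.

```python
import random
import re
import time
import urllib.request

BASE = "http://localhost:8080"  # hypothetical address of the server under test

def extract_links(html: str) -> list:
    """Collect href targets from anchor tags (simple regex, not a full parser)."""
    return re.findall(r'href="([^"]+)"', html)

def fetch(path: str) -> str:
    with urllib.request.urlopen(BASE + path) as resp:
        return resp.read().decode("utf-8", errors="replace")

def session(think_time: float = 1.0) -> None:
    """One simulated visitor: Page1 -> Page2 -> random download -> exit."""
    fetch("/Page1.html")
    time.sleep(think_time)           # think time between requests
    page2 = fetch("/Page2.html")
    time.sleep(think_time)
    links = extract_links(page2)
    if links:
        fetch(random.choice(links))  # download a random file listed on Page2

# session()  # requires the server under test to be running at BASE
```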
Thanks
There are a few different approaches to tackling a problem like this with JMeter:
You can collect all links (or a random selection of them) on a page with the Regular Expression Extractor post-processor, then use them to fetch pages in a ForEach Controller.
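Outside of JMeter, the same extract-then-iterate pattern can be sketched in a few lines of Python; the base URL and regex here are illustrative assumptions, not part of any JMeter configuration:

```python
import re
import urllib.request

BASE = "http://localhost:8080"  # hypothetical server under test

# Plays the role of the Regular Expression Extractor post-processor.
LINK_RE = re.compile(r'href="([^"]+)"')

def fetch(path: str) -> str:
    with urllib.request.urlopen(BASE + path) as resp:
        return resp.read().decode("utf-8", errors="replace")

def fetch_all_links(start_path: str) -> list:
    """ForEach-style loop: extract every link on a page, then request each one."""
    html = fetch(start_path)
    links = LINK_RE.findall(html)
    for link in links:
        fetch(link)
    return links
```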
You can also start from one or more start pages and follow randomly selected links until you return to a start page or an error condition occurs. Use the If Controller to wrap the other controllers and stop processing when such a condition is met.
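A rough Python equivalent of that random walk, with the same stop conditions an If Controller would check (back at the start page, request error, or dead end); the base URL and step limit are assumptions for illustration:

```python
import random
import re
import urllib.request
from typing import Optional
from urllib.error import HTTPError, URLError

BASE = "http://localhost:8080"  # hypothetical server under test
LINK_RE = re.compile(r'href="([^"]+)"')

def choose_next(html: str) -> Optional[str]:
    """Pick a random link from a page, or None if the page is a dead end."""
    links = LINK_RE.findall(html)
    return random.choice(links) if links else None

def random_walk(start_path: str, max_steps: int = 50) -> list:
    """Follow random links until we loop back to the start page, hit an
    error, or reach a dead end."""
    visited = [start_path]
    path = start_path
    for _ in range(max_steps):
        try:
            with urllib.request.urlopen(BASE + path) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except (HTTPError, URLError):
            break                    # error condition: stop the walk
        nxt = choose_next(html)
        if nxt is None or nxt == start_path:
            break                    # dead end, or looped back to the start
        path = nxt
        visited.append(path)
    return visited
```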
Place timers between samplers, and use JMeter variables to define their parameters so that you can easily adjust the pacing as you add more and more threads.
The Ultimate Thread Group and Stepping Thread Group, which are part of the jp@gc (JMeter Plugins) project, offer extra scheduling functionality.
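If you end up rolling your own generator instead, a stepping-style ramp-up is straightforward to approximate with threads; the function name and parameters below are invented for this sketch:

```python
import threading
import time

def run_load(session_fn, threads: int, rampup_s: float, think_time: float) -> None:
    """Stepping-style ramp-up: start one worker every rampup_s / threads
    seconds, then wait for all of them to finish their session."""
    workers = []
    for _ in range(threads):
        t = threading.Thread(target=session_fn, args=(think_time,), daemon=True)
        t.start()
        workers.append(t)
        time.sleep(rampup_s / threads)  # stagger thread start-up
    for t in workers:
        t.join()
```

Here `session_fn` would be one simulated visitor session (e.g. a page walk against the server under test).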