I have a SOAP web service written in Java using Spring-WS.
I need to know whether it can handle 2 million requests per day, and what its performance would be like at that load.
- To what extent is performance under heavy usage related to my Java code and architecture? Is there anything I can improve there?
- And to what extent is it related to the application server I use? Which app server should I use, what are its limitations or relevant settings, and how can I configure and test for this level of performance?
Thanks
Since SOAP is typically carried over HTTP, there are dozens of commercial and open-source tools you can use to run your load and scalability tests.
What you will want to do is make sure that whatever tool you select meets the technical needs of exercising your interface correctly, is a good match for your in-house skills, and contains the reporting you need to determine success or failure in your testing efforts. If you pick a tool that is a mismatch on any of these three fronts, then you may as well have purchased the most expensive tool on the market and hired the most expensive consulting firm to use it... sort of like driving nails with the butt end of a screwdriver instead of using the proper tool (a hammer).
Most of the commercial tools on the market today have lease/rental options for short-term use, and then there are the open-source varieties as well. Each tool has an engineered efficiency associated with the core tasks of script construction, test construction, test execution, and analysis, which is distinct to the tool. The commercial tools tend to have a more balanced weight across these tasks, while the open-source ones tend to require higher levels of effort on the edge tasks of script creation and analysis.
Given that you will be working with at least a couple of million samples (assuming you will want to run for at least 24 hours), you need to make sure that the analysis tools have a demonstrable track record with large data sets. The long-standing commercial performance-test tools all have demonstrable track records at this level; the open-source ones are hit-and-miss, and in some cases analysis becomes a roll-your-own proposition against logged response-time data. You can see how you could burn a lot of time building your own analysis engine here!
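To make the roll-your-own point concrete, below is a minimal sketch of what such a harness has to do at its core: fan requests out over a thread pool, collect per-call latencies, and compute a percentile. The class and method names are hypothetical, and the timed call is a placeholder; a real tool adds ramp-up, pacing, error classification, and reporting on top of this, which is exactly the effort you would be signing up for.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;
import java.util.function.LongSupplier;

// Minimal load-harness sketch: fire N calls across a thread pool,
// collect per-call latencies, and report a percentile.
public class SoapLoadSketch {

    // Run totalCalls invocations of callLatencyMillis on 'threads' workers
    // and return the sorted list of observed latencies (in milliseconds).
    public static List<Long> runLoad(LongSupplier callLatencyMillis,
                                     int threads, int totalCalls)
            throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        List<Long> latencies = Collections.synchronizedList(new ArrayList<>());
        for (int i = 0; i < totalCalls; i++) {
            pool.submit(() -> latencies.add(callLatencyMillis.getAsLong()));
        }
        pool.shutdown();
        pool.awaitTermination(5, TimeUnit.MINUTES);
        List<Long> sorted = new ArrayList<>(latencies);
        Collections.sort(sorted);
        return sorted;
    }

    // Percentile over a sorted latency list (e.g. p = 0.95 for the 95th).
    public static long percentile(List<Long> sorted, double p) {
        int idx = (int) Math.ceil(p * sorted.size()) - 1;
        return sorted.get(Math.max(idx, 0));
    }

    public static void main(String[] args) throws Exception {
        // Stand-in for a real SOAP call: in practice you would time an HTTP
        // POST of your envelope to the endpoint. The fixed 5 ms is a placeholder.
        List<Long> sorted = runLoad(() -> 5L, 10, 1_000);
        System.out.println("p95 latency (ms): " + percentile(sorted, 0.95));
    }
}
```

Even this toy version leaves out everything that makes results trustworthy at millions of samples, which is the argument for tools with a proven analysis back end.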
What you want to do is technically possible, but you may want to re-examine your performance requirements. Here is why. I happen to be working with an organization that today uses a web-services interface to serve clients around the world. Their back-end archive of transactional data is approaching 250 TB, built up over more than ten years of work. Over the past year, the hourly high-water mark was around 60K requests per hour; projected over 24 hours, that still works out to less than your 2 million requests per day.

If you test to an unrealistic level and find issues, are you finding genuine issues, or engineering ghosts: things that would never occur in production because of the difference between production and test volumes? Modeling your load properly is always difficult, but the time spent nailing the load model for your mix of transactions and proper volume is time well spent, because it keeps you from burning developer hours and budget chasing performance ghosts.
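As a quick sanity check on the numbers, averaging the stated requirement over a day gives a fairly modest steady-state rate. The 3x peak-to-average factor below is an assumption for illustration only; derive the real factor from your own traffic profile.

```java
// Back-of-envelope throughput math for the stated 2M-requests/day requirement.
public class RateCheck {
    public static void main(String[] args) {
        long perDay = 2_000_000L;
        double avgPerSec = perDay / 86_400.0;   // ~23.1 requests/second average
        double perHour  = perDay / 24.0;        // ~83,333 requests/hour average
        System.out.printf("avg %.1f req/s, %.0f req/h%n", avgPerSec, perHour);

        // Assumed 3x peak-to-average factor (illustrative only):
        // sizing target of roughly 70 req/s at peak.
        System.out.printf("peak at assumed 3x: %.0f req/s%n", avgPerSec * 3);
    }
}
```

Seen this way, the average rate is well within reach of a single tuned app-server instance; it is the peaks and the transaction mix that your load model needs to capture.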
Good luck!
You could also use StresStimulus, which is a plugin for Fiddler.