We develop an enterprise application, and we need to document the minimum hardware requirements for its target deployments.
Some of the ideas we have tossed around include basing the requirements on our test environments, basing the requirements on the highest specs of each target's components, and basing specs on currently available hardware.
How do you come up with your hardware specs?
In general, we take the published minimum recommendations for our environment (i.e. the minimum requirements for .NET on the client, or for IIS/SQL Server on the database server) as a rough baseline.
We also usually know the application footprint and the expected database size up front, since both can be estimated from the size of the client (user counts, data volumes, and so on).
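To make that estimation concrete, here is a minimal sketch of how a storage figure might be derived from customer size. The record counts, row size, growth horizon, and index overhead below are illustrative assumptions, not values from this answer:

```python
# Rough database sizing from customer size. All default numbers here
# are illustrative assumptions; replace them with measured values.

def estimate_db_size_gb(num_users: int,
                        records_per_user_per_year: int = 5_000,
                        avg_row_bytes: int = 512,
                        years: int = 3,
                        index_overhead: float = 1.5) -> float:
    """Estimate database size in GB for a given customer size."""
    raw_bytes = num_users * records_per_user_per_year * avg_row_bytes * years
    return raw_bytes * index_overhead / 1024**3

# Example: a 200-user customer over a 3-year horizon (~2.1 GB).
print(f"{estimate_db_size_gb(200):.1f} GB")
```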
On top of this we add some 'fudge' headroom based on observations made with Windows PerfMon. We watch the client's memory and CPU usage when the system runs under normal conditions; for the server tier we also take the memory and processor load under stress into account.
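PerfMon itself is driven through its counter UI or data collector logs, so as a rough cross-platform stand-in, here is a sketch of the same observe-then-pad step using Python's psutil. The PID, sampling window, and 1.5x headroom factor are all assumed for illustration:

```python
# Minimal sketch of the "observe, then add fudge" step. The answer above
# uses Windows PerfMon; psutil is only a stand-in for illustration, and
# the 1.5x headroom factor is an assumed value, not a standard.
import time
import psutil

def observe_peak(pid: int, duration_s: int = 60, interval_s: float = 1.0):
    """Sample a process and return peak RSS (bytes) and peak CPU (%)."""
    proc = psutil.Process(pid)
    peak_rss, peak_cpu = 0, 0.0
    end = time.monotonic() + duration_s
    while time.monotonic() < end:
        peak_rss = max(peak_rss, proc.memory_info().rss)
        peak_cpu = max(peak_cpu, proc.cpu_percent(interval=interval_s))
    return peak_rss, peak_cpu

# Hypothetical PID of the application under test.
peak_rss, peak_cpu = observe_peak(pid=1234, duration_s=300)
fudge = 1.5  # assumed headroom factor
print(f"Documented minimum RAM for this tier: {peak_rss * fudge / 1024**2:.0f} MB")
print(f"Peak CPU observed: {peak_cpu:.0f}%")
```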
Based on all of this we come up with our best guesses.
I keep some machines in my test environment on which the minimum hardware and software requirements are based. I think this is the only "safe" way to define such figures.
On those machines I run all kinds of tests at least three times a week to be sure that the application still has enough resources after all the changes. So if you change your test machines, you also change the minimum requirements.
On some projects we define the minimum together with the client, then buy or build test machines matching that minimum and test against them.
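As an illustration of what such a recurring check could look like, here is a minimal sketch that runs a workload on the reference machine and fails when memory use exceeds the agreed minimum. The workload stub and the 2 GB budget are hypothetical placeholders:

```python
# Sketch of a recurring resource check on the reference test machine:
# run a representative workload and fail if memory usage exceeds the
# agreed minimum. The workload stub and the 2 GB budget are
# hypothetical placeholders for this illustration.
import os
import psutil

MEMORY_BUDGET_MB = 2048  # documented minimum agreed with the client

def run_typical_workload():
    # Stand-in for driving the real application; replace with the
    # project's actual load-generation or end-to-end test harness.
    data = [bytes(1024) for _ in range(10_000)]
    return len(data)

def test_fits_resource_budget():
    run_typical_workload()
    used_mb = psutil.Process(os.getpid()).memory_info().rss / 1024**2
    assert used_mb <= MEMORY_BUDGET_MB, (
        f"Resident memory {used_mb:.0f} MB exceeds budget {MEMORY_BUDGET_MB} MB")
```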