
What's a typical average number of ASP.NET sessions per CPU or per memory?

(EDIT: rewritten question to make it clearer, meaning hasn't changed)

You can always build the application and measure its usage. But what I would like to know is: if you have to decide up-front about an ASP.NET application, how many simultaneous users (sessions) typically fit on one machine?

Let's assume the following simplified default setup: InProc sessions, ASP.NET 3.5, NHibernate with second-level (L2) caching, a shopping site (basket properties kept in session).

While I could ascertain that a session won't grow above, say, 20 kB, my experience shows that there is a huge overhead in general, even in well-laid-out applications. I'm looking for the simple calculation you can do on a sticky note.
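To make the assumed setup concrete, here is a minimal sketch (my own illustration, not part of the original question; the Basket, BasketItem, and Checkout names are made up) of a basket kept in InProc session state, plus a crude way to ballpark its size by binary-serializing it. InProc sessions hold live object graphs rather than serialized blobs, so the number is only an approximation of the per-session footprint.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;
using System.Web.UI;

[Serializable]
public class BasketItem
{
    public int ProductId;
    public int Quantity;
    public decimal UnitPrice;
}

[Serializable]
public class Basket
{
    public List<BasketItem> Items = new List<BasketItem>();
}

public partial class Checkout : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // Keep the basket in (InProc) session state.
        var basket = (Basket)Session["Basket"] ?? new Basket();
        basket.Items.Add(new BasketItem { ProductId = 42, Quantity = 1, UnitPrice = 9.99m });
        Session["Basket"] = basket;

        // Ballpark the per-session footprint by binary-serializing the basket.
        // InProc sessions keep live objects, so this is only an approximation.
        using (var ms = new MemoryStream())
        {
            new BinaryFormatter().Serialize(ms, basket);
            Trace.Write("Basket", "Approx. size: " + ms.Length + " bytes");
        }
    }
}
```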

For the bounty: what CPU / memory would you advise your management for each X simultaneous users, ignoring bandwidth requirements? For example, an answer could be: on a 2 GHz Xeon with 1 GB of memory, running Win2k8, you can safely serve 500 simultaneous sessions, but above that it requires careful planning or more hardware.

Asked by Abel, Dec 18 '09


3 Answers

Since you're looking for an actual number, I'll provide some. We are creating a secure HR application using ASP.NET MVC. Like you, we wanted a good feel for the maximum number of concurrent connections, which we defined as the maximum number of pages served in 10 seconds (assuming a user would not wait more than 10 seconds for a page).

Since we were looking for an upper bound, we used a very simple page: SSL plus a few session variables. On a dual quad-core Xeon (8 cores total) with 16 GB of memory and SQL Express as the backend, we were able to hit ~1,000 "concurrent" connections. Neither memory nor SQL Express was the limiting factor, though; it was primarily processor and I/O in our test. Note that we did not use caching, although for a shopping cart I doubt you would either. The page hit the database ~3 times and sent ~150 KB of data (mostly PNG images, uncached). We verified that 1,000 sessions were created, although each was small.

Our point of view is that 1,000 is likely unrealistic. Tests with pages that include business logic and real user data showed a maximum of ~200 concurrent users. However, some users will also be running reports, which can chew up an entire core for up to 30 seconds. In that scenario, 9 concurrent report users could basically make the system unusable for everyone else. Which is the other posters' point: you can grab all the performance numbers you want, but your system might behave entirely differently depending on what it's doing.

Answered by Beep beep


Do you know the "quality" of the code?

Bad code can cost a lot in hardware, while good code can cost next to nothing.

Update based on the comment

A few years ago I had to maintain a badly written app. It used 500 MB of RAM (sometimes 1.5 GB) and took minutes to show anything. I had to rewrite the whole thing, and afterwards it only took the necessary amount of memory (roughly 10-15x less) and was quick at showing stuff; I'm talking milliseconds here.

The number of loops, and the amount of data badly cached in memory, was... incredibly sad to look at. Just to give you an idea: there were three copies of a whole freaking database in memory (so four, counting the real DB), and the code had to update all the copies one after the other. Everything else in the app was based on the in-memory versions.

Anyway, in the end I deleted 25 thousand lines of code.

Quality of the code IS important.

second update

found this, might be good

third update

In an application I'm currently developing, ASP.NET 3.5 using LINQ to SQL, talking (of course) to SQL Server 2005, there are many reads from the DB and not so many writes.

On my own dev machine, an old P4 Prescott with 3 GB of RAM, it takes an average of 20 ms to 100 ms to load a whole page, depending which page :-)

Session memory usage is very low, way under 20 kB for sure.

If I go from here, my bad math would be:

If I have 100 simultaneous users, it would take about 2 seconds to load a page and use at least 2 MB of RAM for the duration of the sessions.

The bad math needed? Figure out what you need for 1 user, then multiply by the number of users you think you should be able to handle (see the sketch below).
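Here is that bad math as a rough sketch (my own illustration; the per-user figures are just the ones quoted above, and the linear scaling is deliberately naive since it assumes every request queues on a single core):

```csharp
using System;

class BadMath
{
    static void Main()
    {
        // Measured for a single user (assumed figures from above).
        const double cpuMsPerPage = 20;   // ms of CPU per page
        const double sessionKb    = 20;   // kB of session state per user
        const int    simultaneous = 100;  // users hitting the server at once

        // Naive linear extrapolation: everyone queues on one core.
        double worstCasePageMs = cpuMsPerPage * simultaneous;        // ~2000 ms
        double sessionMemoryMb = sessionKb * simultaneous / 1024.0;  // ~2 MB

        Console.WriteLine("Worst-case page time: {0} ms", worstCasePageMs);
        Console.WriteLine("Session memory:       {0:F1} MB", sessionMemoryMb);
    }
}
```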

I don't think there is any other way to find out, because again, the code behind the page does matter.

Answered by Fredou


You obviously understand that it depends on the app, and that the best way to find out what an app can support is to measure it. In the past I've used the transaction cost analysis methodology from Microsoft to get some fairly good estimates. I used it back in the day with Site Server Commerce Edition 3.0 and today with modern ASP.NET apps, and it works fairly well.

This link is a snippet from the Microsoft book "Improving .NET Application Performance and Scalability", and it details formulas you can use with performance data (CPU usage, IIS counters, etc.) to calculate the number of users your site can support. I couldn't post a second link to the book, but if you search for scalenet.pdf on Google/Bing you'll find it. Hope this helps.
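As a hedged sketch of the kind of arithmetic that transaction cost analysis walks you through (the exact counters and formulas are in the book; the shape below is my paraphrase and the numbers are made up): measure CPU utilization and throughput under load, derive the CPU cost of one request, then back out the throughput you can sustain at a target utilization.

```csharp
using System;

class TransactionCostAnalysis
{
    static void Main()
    {
        // Observed during a load test (made-up example numbers).
        double cpuUtilization = 0.60;   // average CPU utilization during the test
        int    cpuCount       = 8;      // cores
        double cpuMhz         = 2000;   // clock speed per core
        double requestsPerSec = 400;    // measured throughput

        // Cost of one request, expressed in megacycles of CPU.
        double costPerRequest = (cpuUtilization * cpuCount * cpuMhz) / requestsPerSec;

        // Throughput sustainable at a target utilization (leave headroom).
        double targetUtilization = 0.75;
        double maxRequestsPerSec = (targetUtilization * cpuCount * cpuMhz) / costPerRequest;

        Console.WriteLine("Cost per request:  {0:F1} Mcycles", costPerRequest);
        Console.WriteLine("Sustainable load:  {0:F0} requests/sec", maxRequestsPerSec);
    }
}
```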

Answered by Ameer Deen