I have the following code that I benchmark with JMeter and get about 3,000 requests per second on my localhost machine (the await is missing intentionally so the method runs synchronously):
public async Task<HttpResponseMessage> Get()
{
    var resp = new HttpResponseMessage(HttpStatusCode.OK);
    resp.Content = new StringContent(Thread.CurrentThread.ManagedThreadId.ToString(), Encoding.UTF8, "text/plain");
    return resp;
}
The problem is that when I pause the request for one second like below, the throughput drops for some reason to 10 requests per second for each w3wp.exe process (again, the await is missing intentionally so the method runs synchronously):
public async Task<HttpResponseMessage> Get()
{
    Task.Delay(1000).Wait();
    var resp = new HttpResponseMessage(HttpStatusCode.OK);
    resp.Content = new StringContent(Thread.CurrentThread.ManagedThreadId.ToString(), Encoding.UTF8, "text/plain");
    return resp;
}
Even when I do use await, there is no difference and the throughput stays at 10 requests per second:
public async Task<HttpResponseMessage> Get()
{
    await Task.Delay(1000);
    var resp = new HttpResponseMessage(HttpStatusCode.OK);
    resp.Content = new StringContent(Thread.CurrentThread.ManagedThreadId.ToString(), Encoding.UTF8, "text/plain");
    return resp;
}
I tried all of the config settings below and none of them makes any difference:
web.config
<system.net>
  <connectionManagement>
    <add address="*" maxconnection="65400" />
  </connectionManagement>
</system.net>
aspnet.config
<system.web>
  <applicationPool
      maxConcurrentThreadsPerCPU="100" />
</system.web>
machine.config
<processModel
    autoConfig="false"
    memoryLimit="70"
    maxWorkerThreads="100"
    maxIoThreads="100" />
The configs are set for both x86 and x64.
I have 32 GB of RAM and 4 physical cores, on Windows 10.
The CPU doesn't go over 10% load while benchmarking at 10 requests per second.
The code above uses Web API, but I reproduce the same results with a plain HTTP handler.
Here's one possible explanation. Worth investigating, anyway.
Task.Delay() creates a new task whose job is to pause. If I understand right, tasks are often dispatched to the .NET thread pool, which has a limited size (you can check with ThreadPool.GetMaxThreads). When you try to put too much in, code will 'back up' as it waits for the thread pool to have space.
So let's say you have a thread pool of size 40. Once you've dispatched 40 tasks, all waiting a second, you max out the thread pool. Your bottleneck would be the tasks gumming up the thread pool, not yielding space.
Normally, tasks that do expensive IO, like database queries or file IO, yield control while they wait for the work to be done. I wonder if Task.Delay is more 'clingy'.
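One quick way to check this hypothesis is to print the thread pool's limits and how much of it is in use while the benchmark runs. A minimal console sketch using the standard ThreadPool.GetMaxThreads and ThreadPool.GetAvailableThreads APIs (the class name is just for illustration):

```csharp
using System;
using System.Threading;

class ThreadPoolInfo
{
    static void Main()
    {
        // Query the upper limits of the CLR thread pool.
        ThreadPool.GetMaxThreads(out int maxWorker, out int maxIo);

        // Query how many threads are currently free; the difference is in use.
        ThreadPool.GetAvailableThreads(out int freeWorker, out int freeIo);

        Console.WriteLine($"Max worker: {maxWorker}, max IO: {maxIo}");
        Console.WriteLine($"In use: {maxWorker - freeWorker} worker, {maxIo - freeIo} IO");
    }
}
```

If the pool were really saturated by the delayed requests, the in-use worker count would be close to the maximum; if it sits near zero, the bottleneck is elsewhere.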
Try swapping Task.Delay() for System.Threading.Thread.Sleep() and see if that changes anything.
I know that Windows 8 has a maximum concurrent connection limit of 10, to stop people using consumer OSes to run server workloads. I see no reason why Windows 10 would be any different.
http://blogs.iis.net/owscott/windows-8-iis-8-concurrent-requests-limit
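You can confirm this cap server-side by counting how many requests are actually inside the action at once. A small sketch of a diagnostic helper (the class and member names are hypothetical, not part of any framework API):

```csharp
using System.Threading;

// Hypothetical diagnostic helper: tracks how many requests are currently
// in flight and the highest concurrency observed so far.
public static class ConcurrencyGauge
{
    private static int _inFlight;
    private static int _peak;

    // Highest number of simultaneous callers seen since startup.
    public static int Peak => _peak;

    // Call at the start of the action; returns the concurrency including this caller.
    public static int Enter()
    {
        int current = Interlocked.Increment(ref _inFlight);

        // Lock-free update of the peak value via compare-and-swap.
        int seen;
        do
        {
            seen = _peak;
        } while (current > seen && Interlocked.CompareExchange(ref _peak, current, seen) != seen);

        return current;
    }

    // Call in a finally block when the action completes.
    public static void Exit() => Interlocked.Decrement(ref _inFlight);
}
```

Wrap the body of the Get() action in Enter()/Exit() and include Peak in the response text. If the OS connection limit is the bottleneck, the reported peak should plateau at 10 no matter how much load JMeter applies; if the thread pool were the limit instead, it would plateau near the pool size.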