
How to configure concurrency in .NET Core Web API?

In the old WCF days, you had control over service concurrency via the MaxConcurrentCalls setting. MaxConcurrentCalls defaulted to 16 concurrent calls, but you could raise or lower that value based on your needs.

How do you control server-side concurrency in .NET Core Web API? We probably need to limit it in our case, as too many concurrent requests can impede overall server performance.

asked Jun 06 '17 by user2368632


1 Answer

ASP.NET Core application concurrency is handled by its web server. For example:

Kestrel

var host = new WebHostBuilder()
    .UseKestrel(options => options.ThreadCount = 8)
    .UseStartup<Startup>()
    .Build();

It is not recommended to set the Kestrel thread count to a large value like 1K: because Kestrel's implementation is async-based, a small number of threads can serve many concurrent requests.

More info: Is Kestrel using a single thread for processing requests like Node.js?

A new Limits property was introduced in ASP.NET Core 2.0 Preview 2.

You can now add limits for the following:

  1. Maximum Client Connections
  2. Maximum Request Body Size
  3. Maximum Request Body Data Rate

For example:

.UseKestrel(options =>
{
    options.Limits.MaxConcurrentConnections = 100;
})

IIS

When Kestrel runs behind a reverse proxy, you could tune the proxy itself. For example, you could configure the IIS application pool in web.config or in aspnet.config:

<configuration>
  <system.web>
    <applicationPool
        maxConcurrentRequestsPerCPU="5000"
        maxConcurrentThreadsPerCPU="0"
        requestQueueLimit="5000" />
  </system.web>
</configuration>

Of course, Nginx and Apache have their own concurrency settings.
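Finally, if you want a WCF-style cap on in-flight requests regardless of which server or proxy you use, you can enforce it at the application level. This is only a sketch (the middleware class name and the limit of 16, chosen to mirror the old MaxConcurrentCalls default, are illustrative, not a framework feature):

```csharp
// Sketch: cap concurrent requests with a SemaphoreSlim in custom middleware.
public class ConcurrencyLimitMiddleware
{
    // 16 mirrors the old WCF MaxConcurrentCalls default; tune as needed.
    private static readonly SemaphoreSlim Semaphore = new SemaphoreSlim(16);
    private readonly RequestDelegate _next;

    public ConcurrencyLimitMiddleware(RequestDelegate next)
    {
        _next = next;
    }

    public async Task Invoke(HttpContext context)
    {
        // Requests beyond the limit wait here instead of running concurrently.
        await Semaphore.WaitAsync();
        try
        {
            await _next(context);
        }
        finally
        {
            Semaphore.Release();
        }
    }
}
```

Register it early in Startup.Configure with app.UseMiddleware&lt;ConcurrencyLimitMiddleware&gt;(); note that queued requests still hold connections, so combine this with the server-level limits above rather than relying on it alone.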

answered Sep 17 '22 by Ilya Chumakov