In the old WCF days, you had control over service concurrency via the MaxConcurrentCalls setting. MaxConcurrentCalls defaulted to 16 concurrent calls, but you could raise or lower that value based on your needs.
How do you control server-side concurrency in ASP.NET Core Web API? We probably need to limit it in our case, as too many concurrent requests can impede overall server performance.
ASP.NET Core application concurrency is handled by its web server. For example:

var host = new WebHostBuilder()
    .UseKestrel(options => options.ThreadCount = 8)
    .UseStartup<Startup>()
    .Build();
It is not recommended to set the Kestrel thread count to a large value such as 1K, because Kestrel's implementation is async-based: a small number of threads can serve many connections, since threads are not held while requests wait on I/O. More info: Is Kestrel using a single thread for processing requests like Node.js?
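To illustrate why a thread per request is unnecessary: while an awaited I/O call is in flight, the handling thread returns to the pool and can serve other connections. A minimal controller sketch (the route, controller name, and backend URL are illustrative, not from the original answer):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

public class ReportsController : Controller
{
    private static readonly HttpClient Client = new HttpClient();

    // While the backend call below is awaited, this request does not
    // occupy a server thread; the thread goes back to the pool and
    // can process other incoming requests.
    [HttpGet("api/reports")]
    public async Task<IActionResult> Get()
    {
        var data = await Client.GetStringAsync("http://example.com/data"); // hypothetical backend
        return Ok(data);
    }
}
```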
A new Limits property was introduced on KestrelServerOptions in ASP.NET Core 2.0 Preview 2. You can now add limits for the following:
- Maximum Client Connections
- Maximum Request Body Size
- Maximum Request Body Data Rate
For example:
.UseKestrel(options =>
{
    options.Limits.MaxConcurrentConnections = 100;
})
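The other limits from the list above can be set in the same callback. A sketch, assuming ASP.NET Core 2.0, where the request body data rate limit is expressed via MinRequestBodyDataRate (the chosen values are illustrative):

```csharp
.UseKestrel(options =>
{
    // Reject request bodies larger than 10 MB.
    options.Limits.MaxRequestBodySize = 10 * 1024 * 1024;

    // Abort requests whose body arrives slower than 100 bytes/second,
    // after a 10-second grace period.
    options.Limits.MinRequestBodyDataRate = new MinDataRate(
        bytesPerSecond: 100, gracePeriod: TimeSpan.FromSeconds(10));
})
```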
When Kestrel runs behind a reverse proxy, you can tune the proxy itself. For example, you can configure the IIS application pool in web.config or in aspnet.config:
<configuration>
  <system.web>
    <applicationPool
        maxConcurrentRequestsPerCPU="5000"
        maxConcurrentThreadsPerCPU="0"
        requestQueueLimit="5000" />
  </system.web>
</configuration>
Of course, Nginx and Apache have their own concurrency settings.
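In Nginx, for instance, the relevant knobs are worker_connections (overall capacity per worker) and the limit_conn module (per-client caps). A minimal sketch, with illustrative values and a hypothetical zone name:

```nginx
events {
    # Maximum simultaneous connections per worker process.
    worker_connections 1024;
}

http {
    # Track connections per client IP in a 10 MB shared zone.
    limit_conn_zone $binary_remote_addr zone=perip:10m;

    server {
        # Allow at most 20 concurrent connections per client IP.
        limit_conn perip 20;
    }
}
```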