Limit the simultaneous requests served by the ASP.NET Web API

I'm using ASP.NET Web API 2.2 with OWIN to build a web service, and I've observed that each call to a controller is served by a separate thread on the server side. That's not surprising and is the behavior I expected.

One issue I'm having is that the server-side actions are very memory-intensive, so if more than X users call in at the same time, there is a good chance the server code will throw an out-of-memory exception.

Is it possible to set a global "maximum concurrent action count" so that Web API queues (rather than rejects) incoming calls and only proceeds when a slot frees up?

I can't run the web service as a 64-bit process because some of the referenced libraries don't support it.

I also looked at libraries like https://github.com/stefanprodan/WebApiThrottle, but it can only throttle based on the frequency of calls, not the number of calls in flight.

Thanks

asked Nov 17 '15 by Godsent



1 Answer

You could add a piece of OWIN middleware along these lines (influenced by the WebApiThrottle project you linked to):

using System.Threading;
using System.Threading.Tasks;
using Microsoft.Owin;

public class MaxConcurrentMiddleware : OwinMiddleware
{
    private readonly int maxConcurrentRequests;
    private int currentRequestCount;

    // OwinMiddleware has no parameterless constructor, so the next
    // middleware in the pipeline must be passed through to the base class.
    public MaxConcurrentMiddleware(OwinMiddleware next, int maxConcurrentRequests)
        : base(next)
    {
        this.maxConcurrentRequests = maxConcurrentRequests;
    }

    public override async Task Invoke(IOwinContext context)
    {
        try
        {
            if (Interlocked.Increment(ref currentRequestCount) > maxConcurrentRequests)
            {
                // Reject the request once the concurrency limit is exceeded.
                context.Response.StatusCode = 429; // 429 Too Many Requests
                return;
            }

            await Next.Invoke(context);
        }
        finally
        {
            // Always decrement, whether the request was served or rejected,
            // since the counter was incremented either way.
            Interlocked.Decrement(ref currentRequestCount);
        }
    }
}
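Note that this middleware rejects excess requests with a 429 rather than queuing them, which is what the question actually asks for. A minimal sketch of a queueing variant (the class name and registration below are illustrative assumptions, not part of the original answer) could use `SemaphoreSlim.WaitAsync` so that excess requests wait asynchronously for a free slot instead of failing:

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Owin;

// Hypothetical queueing variant: requests beyond the limit wait for a
// slot instead of receiving a 429. SemaphoreSlim.WaitAsync parks the
// waiter without blocking a thread-pool thread.
public class MaxConcurrentQueueingMiddleware : OwinMiddleware
{
    private readonly SemaphoreSlim slots;

    public MaxConcurrentQueueingMiddleware(OwinMiddleware next, int maxConcurrentRequests)
        : base(next)
    {
        slots = new SemaphoreSlim(maxConcurrentRequests, maxConcurrentRequests);
    }

    public override async Task Invoke(IOwinContext context)
    {
        // Asynchronously wait for a free slot; the request is effectively queued.
        await slots.WaitAsync();
        try
        {
            await Next.Invoke(context);
        }
        finally
        {
            slots.Release();
        }
    }
}

// Registration sketch in the OWIN Startup class (limit of 10 is an example):
// app.Use<MaxConcurrentQueueingMiddleware>(10);
```

In practice you would probably combine the two approaches: call the `WaitAsync(TimeSpan)` overload, which returns `false` on timeout, so a request that has queued too long still gets a 429 rather than waiting indefinitely.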
answered Sep 23 '22 by Trevor Pilley