 

Increase Http Runtime MaxRequestLength from C# code

Tags:

asp.net

How can I increase the httpRuntime maxRequestLength from my C# code? I can't do this in Web.config; my application is a web application deployed in IIS.

asked Apr 25 '11 by BreakHead

2 Answers

Take a look at http://bytes.com/topic/asp-net/answers/346534-how-i-can-get-httpruntime-section-page

That shows how to get access to an instance of HttpRuntimeSection; you can then modify its MaxRequestLength property.
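A sketch of that approach, assuming a full-trust ASP.NET application. Note the caveat: a section fetched through ConfigurationManager.GetSection at runtime is read-only, so to change the value you open the site configuration and save it, which does rewrite Web.config on disk (the app pool needs write access, and saving triggers an application restart):

```csharp
using System.Configuration;        // Configuration
using System.Web.Configuration;    // WebConfigurationManager, HttpRuntimeSection

// Open the site's configuration ("~" = application root).
Configuration config = WebConfigurationManager.OpenWebConfiguration("~");

// Grab the <httpRuntime> section and raise the limit (value is in KB).
HttpRuntimeSection httpRuntime =
    (HttpRuntimeSection)config.GetSection("system.web/httpRuntime");
httpRuntime.MaxRequestLength = 102400; // ~100 MB

// Persist the change -- this rewrites Web.config on disk.
config.Save();
```

Because the save recycles the application, this is better suited to a one-time setup or deployment step than to a per-request change.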

answered Sep 20 '22 by Onkelborg


An alternative to increasing the max request length is to create an IHttpModule implementation. In the BeginRequest handler, grab the HttpWorkerRequest to process it entirely in your own code, rather than letting the default implementation handle it.

Here is a basic implementation that will handle any request posted to any file called "dropbox.aspx" (in any directory, whether it exists or not):

using System;
using System.Web;

namespace Example
{
    public class FileUploadModule: IHttpModule
    {
        #region IHttpModule Members

        public void Dispose() {}

        public void Init(HttpApplication context)
        {
            context.BeginRequest += new EventHandler(context_BeginRequest);
        }

        #endregion

        void context_BeginRequest(object sender, EventArgs e)
        {
            HttpApplication application = (HttpApplication)sender;
            HttpContext context = application.Context;
            string filePath = context.Request.FilePath;
            string fileName = VirtualPathUtility.GetFileName( filePath );
            string fileExtension = VirtualPathUtility.GetExtension(filePath); // e.g. ".aspx" (not used below)

            if (fileName == "dropbox.aspx")
            {
                IServiceProvider provider = (IServiceProvider)context;
                HttpWorkerRequest wr = (HttpWorkerRequest)provider.GetService(typeof(HttpWorkerRequest));

                //HANDLE REQUEST HERE
                //Grab data from HttpWorkerRequest instance, as reflected in HttpRequest.GetEntireRawContent method.

                application.CompleteRequest(); //bypasses all other modules and ends request immediately
            }
        }
    }
}
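For the module to run, it has to be registered in Web.config. A sketch, assuming the Example namespace above (the classic pipeline uses system.web/httpModules; IIS 7+ integrated mode uses system.webServer/modules instead):

```xml
<!-- Classic pipeline (IIS 6 / IIS 7 classic mode) -->
<system.web>
  <httpModules>
    <add name="FileUploadModule" type="Example.FileUploadModule" />
  </httpModules>
</system.web>

<!-- IIS 7+ integrated pipeline -->
<system.webServer>
  <modules>
    <add name="FileUploadModule" type="Example.FileUploadModule" />
  </modules>
</system.webServer>
```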

You could use something like that, for example, if you're implementing a file uploader and want to process the multi-part content stream as it's received: you can authenticate based on posted form fields and, more importantly, cancel the request on the server side before you even receive any file data. That can save a lot of time if you can determine early in the stream that the upload is not authorized, the file is too big, or it would exceed the user's disk quota for the dropbox.

This is impossible to do with the default implementation, because trying to access the Form property of the HttpRequest will cause it to try to receive the entire request stream, complete with MaxRequestLength checks. The HttpRequest object has a method called "GetEntireRawContent" which is called as soon as access to the content is needed. That method starts with the following code:

HttpRuntimeSection httpRuntime = RuntimeConfig.GetConfig(this._context).HttpRuntime;
int maxRequestLengthBytes = httpRuntime.MaxRequestLengthBytes;
if (this.ContentLength > maxRequestLengthBytes)
{
    if (!(this._wr is IIS7WorkerRequest))
    {
        this.Response.CloseConnectionAfterError();
    }
    throw new HttpException(SR.GetString("Max_request_length_exceeded"), null, 0xbbc);
}

The point is that you'll be skipping that code and implementing your own content-length check instead. If you use Reflector to look at the rest of GetEntireRawContent as a model implementation, you'll see that it basically does the following: calls GetPreloadedEntityBody, checks whether there's more to load by calling IsEntireEntityBodyIsPreloaded, and finally loops over ReadEntityBody calls to get the rest of the data. The data read by GetPreloadedEntityBody and ReadEntityBody is dumped into a specialized stream, which automatically switches to a temporary file as a backing store once it crosses a size threshold.

A basic implementation would look like this:

MemoryStream request_content = new MemoryStream();
int bytesRemaining = wr.GetTotalEntityBodyLength() - wr.GetPreloadedEntityBodyLength();
byte[] preloaded_data = wr.GetPreloadedEntityBody();
if (preloaded_data != null)
    request_content.Write( preloaded_data, 0, preloaded_data.Length );
if (!wr.IsEntireEntityBodyIsPreloaded()) //not a typo: the method name really does repeat "Is"
{
    const int BUFFER_SIZE = 0x2000; //8K buffer or whatever
    byte[] buffer = new byte[BUFFER_SIZE];
    while (bytesRemaining > 0)
    {
        int bytesRead = wr.ReadEntityBody(buffer, Math.Min( bytesRemaining, BUFFER_SIZE )); //Read another chunk
        if (bytesRead == 0) //failure to read or nothing left to read
            break;
        bytesRemaining -= bytesRead; // Update the bytes remaining
        request_content.Write( buffer, 0, bytesRead ); // Write the chunk to the backing store (memory stream or whatever you want)
    }
}

At that point, you'll have your entire request in a MemoryStream. However, rather than reading the entire request up front like that, what I've done is move that "bytesRemaining" loop into a "ReadEnough( int max_index )" method on a specialized MemoryStream, called on demand so that the stream "loads just enough" data to reach the byte being accessed.

Ultimately, that architecture allows me to send the request directly to a parser that reads from the memory stream, and the memory stream automatically loads more data from the worker request as needed. I've also implemented events so that as each element of the multi-part content stream is parsed, it fires events when each new part is identified and when each part is completely received.
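A rough sketch of that on-demand idea (the class and member names here are hypothetical, not the answer's actual code; the real implementation also spills to a temp file and raises parsing events):

```csharp
using System;
using System.IO;
using System.Web;

// Hypothetical sketch: a MemoryStream that pulls more bytes from the
// HttpWorkerRequest only when a reader needs them.
public class OnDemandRequestStream : MemoryStream
{
    private readonly HttpWorkerRequest wr;
    private int bytesRemaining;
    private readonly byte[] buffer = new byte[0x2000];

    public OnDemandRequestStream(HttpWorkerRequest wr)
    {
        this.wr = wr;
        bytesRemaining = wr.GetTotalEntityBodyLength() - wr.GetPreloadedEntityBodyLength();
        byte[] preloaded = wr.GetPreloadedEntityBody();
        if (preloaded != null)
            Write(preloaded, 0, preloaded.Length);
    }

    // Pull data from the worker request until the stream holds at least
    // maxIndex + 1 bytes, or the request body is exhausted.
    public void ReadEnough(int maxIndex)
    {
        while (Length <= maxIndex && bytesRemaining > 0)
        {
            int read = wr.ReadEntityBody(buffer, Math.Min(bytesRemaining, buffer.Length));
            if (read == 0)
                break; // connection dropped
            bytesRemaining -= read;
            long pos = Position;
            Position = Length;      // append at the end...
            Write(buffer, 0, read);
            Position = pos;         // ...then restore the read cursor
        }
    }
}
```

A consumer (such as a multi-part parser) would call ReadEnough before each access, so data flows in only as fast as it is parsed, and an early rejection never pulls the rest of the upload off the wire.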

answered Sep 22 '22 by Triynko