I have a fairly bog-standard .NET MVC 4 Web API application.
public class LogsController : ApiController
{
    public HttpResponseMessage PostLog(List<LogDto> logs)
    {
        if (logs != null && logs.Any())
        {
            var goodLogs = new List<Log>();
            var badLogs = new List<LogBad>();

            foreach (var logDto in logs)
            {
                if (logDto.IsValid())
                {
                    goodLogs.Add(logDto.ToLog());
                }
                else
                {
                    badLogs.Add(logDto.ToLogBad());
                }
            }

            if (goodLogs.Any())
            {
                _logsRepo.Save(goodLogs);
            }

            if (badLogs.Any())
            {
                _logsBadRepo.Save(badLogs);
            }
        }

        return new HttpResponseMessage(HttpStatusCode.OK);
    }
}
This all works fine: I have devices that are able to send me their logs, and it works well. However, we are now starting to have concerns about the size of the data being transferred, and we want to look at accepting POSTs whose bodies have been compressed using GZIP.
How would I go about doing this? Is it a setting in IIS, or could I use action filters?
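For context, the kind of call I'd like the devices to make looks roughly like this. This is just a sketch: the URL is a placeholder and I'm assuming Json.NET for serialization.
// Sketch only: a device gzip-compressing its JSON log payload before POSTing it.
using (var client = new HttpClient())
using (var buffer = new MemoryStream())
{
    var json = JsonConvert.SerializeObject(logs); // assumes Json.NET

    using (var gzip = new GZipStream(buffer, CompressionMode.Compress, leaveOpen: true))
    using (var writer = new StreamWriter(gzip))
    {
        writer.Write(json);
    }

    buffer.Seek(0, SeekOrigin.Begin);

    var content = new StreamContent(buffer);
    content.Headers.ContentType = new MediaTypeHeaderValue("application/json");
    content.Headers.ContentEncoding.Add("gzip"); // flag the body as gzip-compressed

    // Placeholder URL
    var response = client.PostAsync("http://myserver/api/logs", content).Result;
}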
EDIT 1
Following up from Filip's answer, my thinking is that I need to intercept the processing of the request before it gets to my controller. If I can catch the request before the Web API framework attempts to parse the body into my business objects (which currently fails because the body is still compressed), then I can decompress the body and pass the request back into the processing chain, and hopefully the Web API framework will be able to parse the (decompressed) body into my business objects.
It looks like using a DelegatingHandler is the way to go. It gives me access to the request during processing, but before my controller. So I tried the following:
public class gZipHandler : DelegatingHandler
{
    protected override System.Threading.Tasks.Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, System.Threading.CancellationToken cancellationToken)
    {
        string encodingType = request.Headers.AcceptEncoding.First().Value;
        request.Content = new DeCompressedContent(request.Content, encodingType);
        return base.SendAsync(request, cancellationToken);
    }
}

public class DeCompressedContent : HttpContent
{
    private HttpContent originalContent;
    private string encodingType;

    public DeCompressedContent(HttpContent content, string encodType)
    {
        originalContent = content;
        encodingType = encodType;
    }

    protected override bool TryComputeLength(out long length)
    {
        length = -1;
        return false;
    }

    protected override Task<Stream> CreateContentReadStreamAsync()
    {
        return base.CreateContentReadStreamAsync();
    }

    protected override Task SerializeToStreamAsync(Stream stream, TransportContext context)
    {
        Stream compressedStream = null;

        if (encodingType == "gzip")
        {
            compressedStream = new GZipStream(stream, CompressionMode.Decompress, leaveOpen: true);
        }

        return originalContent.CopyToAsync(compressedStream).ContinueWith(tsk =>
        {
            if (compressedStream != null)
            {
                compressedStream.Dispose();
            }
        });
    }
}
This seems to be working OK. The SendAsync method is being called before my controller, and the constructor for DeCompressedContent is being called. However, SerializeToStreamAsync is never being called, so I added CreateContentReadStreamAsync to see if that's where the decompressing should be happening, but that's not being called either.
I feel like I am close to the solution, but just need a little bit extra to get it over the line.
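For completeness, here is how I'm registering the handler; this is just my assumed setup, based on the standard Web API 4 WebApiConfig pattern.
// Assumed registration (standard Web API 4 WebApiConfig), included for completeness.
public static class WebApiConfig
{
    public static void Register(HttpConfiguration config)
    {
        // Register the handler so it runs before the controller pipeline.
        config.MessageHandlers.Add(new gZipHandler());

        config.Routes.MapHttpRoute(
            name: "DefaultApi",
            routeTemplate: "api/{controller}/{id}",
            defaults: new { id = RouteParameter.Optional });
    }
}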
I had the same requirement to POST gzipped data to a .NET Web API controller. I came up with this solution:
public class GZipToJsonHandler : DelegatingHandler
{
    protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request,
        CancellationToken cancellationToken)
    {
        // Handle only if content type is 'application/gzip'
        if (request.Content.Headers.ContentType == null ||
            request.Content.Headers.ContentType.MediaType != "application/gzip")
        {
            return base.SendAsync(request, cancellationToken);
        }

        // Read in the input stream, then decompress into the output stream.
        // Doing this asynchronously, but it's not really required at this point
        // since we end up waiting on it right after this.
        Stream outputStream = new MemoryStream();
        Task task = request.Content.ReadAsStreamAsync().ContinueWith(t =>
        {
            Stream inputStream = t.Result;
            var gzipStream = new GZipStream(inputStream, CompressionMode.Decompress);

            gzipStream.CopyTo(outputStream);
            gzipStream.Dispose();

            outputStream.Seek(0, SeekOrigin.Begin);
        });

        // Wait for the input stream and decompression to complete. Would be nice
        // to not block here and work async when ready instead, but I couldn't
        // figure out how to do it in the context of a DelegatingHandler.
        task.Wait();

        // This next section is the key...

        // Save the original content
        HttpContent origContent = request.Content;

        // Replace the request content with the newly decompressed stream
        request.Content = new StreamContent(outputStream);

        // Copy all headers from the original content into the new one
        foreach (var header in origContent.Headers)
        {
            request.Content.Headers.Add(header.Key, header.Value);
        }

        // Replace the original content type with the content type
        // of the decompressed data. In our case, we can assume application/json. A
        // more generic and reusable handler would need some other
        // way to differentiate the decompressed content type.
        request.Content.Headers.Remove("Content-Type");
        request.Content.Headers.Add("Content-Type", "application/json");

        return base.SendAsync(request, cancellationToken);
    }
}
Using this approach, the existing controller, which normally works with JSON content and automatic model binding, continued to work without any changes.
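For reference, registering the handler and calling it from a client might look roughly like this. The registration location and the URL are assumptions on my part; any HTTP client that can gzip a body and set Content-Type: application/gzip will do.
// Server side: register the handler globally (e.g. in Global.asax / WebApiConfig).
GlobalConfiguration.Configuration.MessageHandlers.Add(new GZipToJsonHandler());

// Client-side sketch: gzip the JSON body and mark it as application/gzip.
using (var client = new HttpClient())
using (var buffer = new MemoryStream())
{
    using (var gzip = new GZipStream(buffer, CompressionMode.Compress, leaveOpen: true))
    using (var writer = new StreamWriter(gzip))
    {
        writer.Write("[{\"Message\":\"hello\"}]"); // placeholder JSON payload
    }

    buffer.Seek(0, SeekOrigin.Begin);

    var content = new StreamContent(buffer);
    content.Headers.ContentType = new MediaTypeHeaderValue("application/gzip");

    var response = client.PostAsync("http://myserver/api/logs", content).Result; // placeholder URL
}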
I'm not sure why the other answer was accepted. It provides a solution for handling the responses (which is common), but not requests (which is uncommon). The Accept-Encoding header is used to specify acceptable response encodings, and is not related to request encodings.
I believe the correct answer is Kaliatech's, and I would have left this as a comment and voted his up if I had enough reputation points, since I think his is basically correct.
However, my situation called for looking at the content encoding rather than the content type. Using this approach, the calling system can still specify that the content type is JSON/XML/etc., but indicate via the Content-Encoding header that the data is compressed with gzip (or potentially another encoding/compression mechanism). This prevented me from needing to change the content type after decoding the input and allows any content type information to flow through in its original state.
Here's the code. Again, 99% of this is Kaliatech's answer, including the comments, so please vote his post up if this is useful.
public class CompressedRequestHandler : DelegatingHandler
{
    protected override System.Threading.Tasks.Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, System.Threading.CancellationToken cancellationToken)
    {
        if (IsRequestCompressed(request))
        {
            request.Content = DecompressRequestContent(request);
        }

        return base.SendAsync(request, cancellationToken);
    }

    private bool IsRequestCompressed(HttpRequestMessage request)
    {
        if (request.Content.Headers.ContentEncoding != null &&
            request.Content.Headers.ContentEncoding.Contains("gzip"))
        {
            return true;
        }

        return false;
    }

    private HttpContent DecompressRequestContent(HttpRequestMessage request)
    {
        // Read in the input stream, then decompress into the output stream.
        // Doing this asynchronously, but it's not really required at this point
        // since we end up waiting on it right after this.
        Stream outputStream = new MemoryStream();
        Task task = request.Content.ReadAsStreamAsync().ContinueWith(t =>
        {
            Stream inputStream = t.Result;
            var gzipStream = new GZipStream(inputStream, CompressionMode.Decompress);

            gzipStream.CopyTo(outputStream);
            gzipStream.Dispose();

            outputStream.Seek(0, SeekOrigin.Begin);
        });

        // Wait for the input stream and decompression to complete. Would be nice
        // to not block here and work async when ready instead, but I couldn't
        // figure out how to do it in the context of a DelegatingHandler.
        task.Wait();

        // Save the original content
        HttpContent origContent = request.Content;

        // Replace the request content with the newly decompressed stream
        HttpContent newContent = new StreamContent(outputStream);

        // Copy all headers from the original content into the new one
        foreach (var header in origContent.Headers)
        {
            newContent.Headers.Add(header.Key, header.Value);
        }

        return newContent;
    }
}
I then registered this handler globally, which could be a dicey proposition if you are vulnerable to DoS attacks, but our service is locked down, so it works for us:
GlobalConfiguration.Configuration.MessageHandlers.Add(new CompressedRequestHandler());
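If global registration is a concern, Web API also supports attaching a message handler to a single route. A rough sketch of that alternative is below; the route name and template are illustrative, and config is the HttpConfiguration instance.
// Alternative sketch: scope the handler to one route instead of registering it globally.
config.Routes.MapHttpRoute(
    name: "CompressedLogs",                 // illustrative route name
    routeTemplate: "api/logs",              // illustrative template
    defaults: new { controller = "Logs" },
    constraints: null,
    handler: new CompressedRequestHandler
    {
        // The dispatcher hands the request on to the normal controller pipeline.
        InnerHandler = new HttpControllerDispatcher(config)
    });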
While Web API doesn't support the Accept-Encoding header out of the box, Kiran has a terrific blog post on how to do that using a custom MessageHandler: http://blogs.msdn.com/b/kiranchalla/archive/2012/09/04/handling-compression-accept-encoding-sample.aspx
If you implement his solution, all you need to do is issue a request with an Accept-Encoding: gzip or Accept-Encoding: deflate header, and the Web API response will be compressed in the message handler for you.
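On the client side, HttpClient can then decompress such responses transparently if automatic decompression is enabled. A minimal sketch (the URL is a placeholder):
// Client-side sketch: request a compressed response and let HttpClient decompress it.
var handler = new HttpClientHandler
{
    // Sends Accept-Encoding: gzip, deflate and transparently decompresses the response.
    AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate
};

using (var client = new HttpClient(handler))
{
    var json = client.GetStringAsync("http://myserver/api/logs").Result; // placeholder URL
}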