I have a .NET Windows Service which spawns a thread that basically just acts as an HttpListener. This is working fine in synchronous mode, for example:
private void CreateLListener()
{
    HttpListener listener = new HttpListener();
    listener.Prefixes.Add("http://+:8080/");   // prefix shortened here for the example
    listener.Start();

    bool listen = true;
    while (listen)
    {
        HttpListenerContext context = null;
        try
        {
            // Blocks until a request arrives
            context = listener.GetContext();
        }
        catch (HttpListenerException)
        {
            // Thrown when the listener is stopped from another thread
            listen = false;
            continue;
        }
        // process request and make response
    }
}
The problem I now have is that I need this to work with multiple requests and respond to them simultaneously, or at least in an overlapping way.
To explain further - the client is a media player app which starts by making a request for a media file with the request header Range: bytes=0-. As far as I can tell it does this to work out what the media container is.
After it has read a 'chunk' (or once it has read enough to ascertain the media type) it makes another request (from a different client socket) with Range: bytes=X-Y. In this case Y is the Content-Length returned in the first response and X is 250,000 bytes less than that (discovered using IIS as a test). At this stage it is reading the last 'chunk' to see if it can find a media time-stamp to gauge the length.
Having read that, it makes another request with Range: bytes=0- (from another socket) to start streaming the media file properly.
At any time though, if the user of the client performs a 'skip' operation, it sends another request (from yet another socket) with Range: bytes=Z-, where Z is the position to jump to in the media file.
I'm not very good with HTTP stuff but as far as I can tell I need to use multiple threads to handle each request/response while allowing the original HttpListener
to return to listening. I've done plenty of searching but can't find a model which seems to fit.
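For context, what each handler has to do once it has a context is roughly this - a sketch only (the ServeRange name, the filePath parameter and the content type are illustrative, and validation is omitted); my question is about getting the contexts handled concurrently, not this part:
private void ServeRange(HttpListenerContext context, string filePath)
{
    using (FileStream fs = File.OpenRead(filePath))
    {
        long start = 0, end = fs.Length - 1;

        // Parse "Range: bytes=X-Y" (Y may be omitted, e.g. "bytes=0-")
        string range = context.Request.Headers["Range"];
        if (!string.IsNullOrEmpty(range) && range.StartsWith("bytes="))
        {
            string[] parts = range.Substring(6).Split('-');
            if (parts[0].Length > 0) start = long.Parse(parts[0]);
            if (parts.Length > 1 && parts[1].Length > 0) end = long.Parse(parts[1]);

            context.Response.StatusCode = 206; // Partial Content
            context.Response.AddHeader("Content-Range",
                string.Format("bytes {0}-{1}/{2}", start, end, fs.Length));
        }

        long length = end - start + 1;
        context.Response.ContentLength64 = length;
        context.Response.ContentType = "video/mp4"; // assumption - depends on the actual media

        // Copy the requested byte range to the response
        fs.Seek(start, SeekOrigin.Begin);
        byte[] buffer = new byte[64 * 1024];
        long remaining = length;
        while (remaining > 0)
        {
            int read = fs.Read(buffer, 0, (int)Math.Min(buffer.Length, remaining));
            if (read <= 0) break;
            context.Response.OutputStream.Write(buffer, 0, read);
            remaining -= read;
        }
        context.Response.OutputStream.Close();
    }
}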
EDIT:
Acknowledgement and gratitude to Rick Strahl for the following example which I was able to adapt to suit my needs...
Add a Web Server to your .NET 2.0 app with a few lines of code
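The gist of the pattern I adapted is below - a sketch rather than Rick's exact code; the prefix and handler names are placeholders:
private HttpListener listener;

private void StartListener()
{
    listener = new HttpListener();
    listener.Prefixes.Add("http://+:8080/");   // placeholder prefix
    listener.Start();
    // Post the first asynchronous accept; the calling thread is now free
    listener.BeginGetContext(OnContext, listener);
}

private void OnContext(IAsyncResult ar)
{
    HttpListenerContext context;
    try
    {
        context = listener.EndGetContext(ar);
    }
    catch (HttpListenerException)
    {
        return;   // listener was stopped
    }
    catch (ObjectDisposedException)
    {
        return;   // listener was closed
    }

    // Immediately queue the next accept so other requests aren't blocked
    listener.BeginGetContext(OnContext, listener);

    // Process this request/response on the current (thread-pool) thread,
    // e.g. ServeRange(context, pathToMediaFile);
}
Each callback runs on a thread-pool thread, so the listening loop is never blocked by a slow client.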
If you need a simpler alternative to BeginGetContext, you can just queue jobs on the ThreadPool instead of executing them on the listening thread, like so:
private void CreateLListener()
{
    //....
    while (true)
    {
        // GetContext blocks until a request arrives; the work item then runs on a pool thread
        ThreadPool.QueueUserWorkItem(Process, listener.GetContext());
    }
}

void Process(object o)
{
    var context = o as HttpListenerContext;
    // process request and make response
}
If you're here from the future and trying to handle multiple concurrent requests with a single thread using async/await:
public async Task Listen(string prefix, int maxConcurrentRequests, CancellationToken token)
{
    HttpListener listener = new HttpListener();
    listener.Prefixes.Add(prefix);
    listener.Start();

    // Seed the set with the maximum number of pending accepts
    var requests = new HashSet<Task>();
    for (int i = 0; i < maxConcurrentRequests; i++)
        requests.Add(listener.GetContextAsync());

    while (!token.IsCancellationRequested)
    {
        // Wait for either an incoming request or a completed request handler
        Task t = await Task.WhenAny(requests);
        requests.Remove(t);

        if (t is Task<HttpListenerContext>)
        {
            var context = (t as Task<HttpListenerContext>).Result;
            // Start handling this request and immediately post another accept
            requests.Add(ProcessRequestAsync(context));
            requests.Add(listener.GetContextAsync());
        }
    }
}

public async Task ProcessRequestAsync(HttpListenerContext context)
{
    ...do stuff...
}
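A possible way to host it (just a sketch; the Server class name, prefix and concurrency level are assumptions, not part of the answer above):
static async Task Main()
{
    var cts = new CancellationTokenSource();
    Console.CancelKeyPress += (s, e) => { e.Cancel = true; cts.Cancel(); };

    // "Server" is whatever class holds Listen/ProcessRequestAsync;
    // the prefix and the limit of 4 in-flight requests are placeholders.
    var server = new Server();
    await server.Listen("http://+:8080/", 4, cts.Token);
}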