I have a situation where I have to receive requests in a Web API method, queue those requests, and then send them in bulk to a database (a Solr instance).
I am not really sure how to maintain a batch of requests that come from multiple sources. For now I am writing each request's data in JSON format to a file on disk; later a Windows service will go through the folder, read all the files, update the database, and delete those files.
Here is what I am doing in my Web API:
public void Post(LogEntry value)
{
    value.EventID = Guid.NewGuid();
    value.ServerTime = DateTime.UtcNow;

    string json = JsonConvert.SerializeObject(value);
    using (StreamWriter sw = new StreamWriter(value.EventID.ToString()))
    {
        sw.Write(json);
    }
}
(Here EventID is a GUID.)
This process doesn't look right. There must be a way to maintain a queue of requests, but I am not sure how to maintain one across multiple concurrent requests.
The reason I am doing this is that inserting into a Solr instance in batches is faster than inserting a single record at a time through SolrNet. I expect at least 100 requests per second on the Web API. I want to build batches of 1000 requests and update the Solr instance every 10 seconds. Please don't think that I need code; I just need to know what strategy I should adopt to maintain a queue of requests / state.
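For reference, here is a minimal in-process sketch of the "batch of 1000 or every 10 seconds" idea, assuming a single shared batcher instance (the class name RequestBatcher and the flush callback are my own placeholders, not part of the question). All Web API threads add to one BlockingCollection, and a single background consumer drains it and flushes either when the batch size is reached or the interval elapses, whichever comes first:

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading;

// Hypothetical sketch: RequestBatcher and the flush delegate are assumed names.
// The flush delegate would wrap a SolrNet AddRange + Commit in the real app.
public class RequestBatcher<T> : IDisposable
{
    private readonly BlockingCollection<T> _queue = new BlockingCollection<T>();
    private readonly int _batchSize;
    private readonly TimeSpan _flushInterval;
    private readonly Action<List<T>> _flush;
    private readonly Thread _worker;

    public RequestBatcher(int batchSize, TimeSpan flushInterval, Action<List<T>> flush)
    {
        _batchSize = batchSize;
        _flushInterval = flushInterval;
        _flush = flush;
        _worker = new Thread(ConsumeLoop) { IsBackground = true };
        _worker.Start();
    }

    // Called from the Web API action; thread-safe and non-blocking.
    public void Add(T item)
    {
        _queue.Add(item);
    }

    private void ConsumeLoop()
    {
        var batch = new List<T>(_batchSize);
        var deadline = DateTime.UtcNow + _flushInterval;
        while (!_queue.IsCompleted)
        {
            var wait = deadline - DateTime.UtcNow;
            T item;
            if (wait > TimeSpan.Zero && _queue.TryTake(out item, wait))
                batch.Add(item);

            // Flush on size or on timeout, then reset the deadline.
            if (batch.Count >= _batchSize || DateTime.UtcNow >= deadline)
            {
                if (batch.Count > 0)
                {
                    _flush(new List<T>(batch));
                    batch.Clear();
                }
                deadline = DateTime.UtcNow + _flushInterval;
            }
        }
    }

    public void Dispose()
    {
        _queue.CompleteAdding();
    }
}
```

The controller would hold one singleton RequestBatcher&lt;LogEntry&gt; and call Add(value) instead of writing a file. The trade-off versus a disk spool or MSMQ is durability: anything still queued in memory is lost if the process crashes or the app pool is recycled.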
This is a very good scenario for MSMQ. For each request, just post the item to an MSMQ queue. Then, either in the same web app or in any other app, read multiple items from the queue and post them to Solr in one batch. Regardless of whether your app crashes or gets recycled, MSMQ will hold your data safely until you retrieve it.
MSMQ is robust, reliable, and scalable. It is a perfect fit for your problem.
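A rough sketch of both sides, using the System.Messaging API (the queue path .\private$\solrlog and the class name SolrBridge are assumptions for illustration; LogEntry is the DTO from the question, and the MSMQ Windows feature plus a reference to System.Messaging.dll are required):

```csharp
using System;
using System.Collections.Generic;
using System.Messaging; // requires the MSMQ Windows feature and System.Messaging.dll

// Hypothetical helper; queue path and class name are assumed, not from the question.
public static class SolrBridge
{
    private const string QueuePath = @".\private$\solrlog";

    // Producer side: call this from the Web API action instead of writing a file.
    public static void Enqueue(LogEntry value)
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(LogEntry) });
            queue.Send(value);
        }
    }

    // Consumer side: run periodically (e.g. every 10 seconds) in the Windows
    // service; drains up to batchSize messages so the caller can do a single
    // SolrNet AddRange + Commit.
    public static List<LogEntry> DequeueBatch(int batchSize)
    {
        var batch = new List<LogEntry>(batchSize);
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(LogEntry) });
            while (batch.Count < batchSize)
            {
                try
                {
                    var msg = queue.Receive(TimeSpan.FromSeconds(1));
                    batch.Add((LogEntry)msg.Body);
                }
                catch (MessageQueueException)
                {
                    break; // receive timed out: the queue is drained for now
                }
            }
        }
        return batch;
    }
}
```

Because MSMQ persists messages, the consumer can crash or restart between batches without losing any queued requests, which is exactly the durability the file-on-disk approach was trying to achieve by hand.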