I believe a pretty common scenario is to have a queue of items that should be processed N at a time. For instance, if we have 23 items and should process 10 at a time, it would be like:
Process batch of 10
Process batch of 10
Process batch of 3
I can solve this problem in a variety of ways. My question is: does the .NET Framework provide any class designed specifically to address this scenario? The Queue<T> class would be perfect, but it doesn't allow for dequeuing multiple items at once.
You could create an extension method on Queue<T>:
public static class QueueExtensions
{
    // Lazily dequeues up to chunkSize items, stopping early if the queue runs out.
    public static IEnumerable<T> DequeueChunk<T>(this Queue<T> queue, int chunkSize)
    {
        for (int i = 0; i < chunkSize && queue.Count > 0; i++)
        {
            yield return queue.Dequeue();
        }
    }
}
Usage:
var q = new Queue<char>();
// DequeueChunk is lazy (yield return): enumerate each chunk (e.g. with ToList())
// before requesting the next one, or nothing is actually dequeued.
var first = q.DequeueChunk(10).ToList(); // first 10 items
var next = q.DequeueChunk(10).ToList();  // next 10 items
Example: https://dotnetfiddle.net/OTcIZX
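For the full 23-item scenario from the question, you can keep calling DequeueChunk until the queue is empty. A minimal sketch, assuming the extension method above is in scope:

using System;
using System.Collections.Generic;
using System.Linq;

var q = new Queue<int>(Enumerable.Range(1, 23)); // 23 items
while (q.Count > 0)
{
    // Materialize each chunk so the items are actually dequeued.
    var batch = q.DequeueChunk(10).ToList();
    Console.WriteLine($"Process batch of {batch.Count}"); // prints 10, 10, 3
}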
You can achieve this in .NET with LINQ by using the Enumerable.Range() method along with the Select() extension method:
var chunk = Enumerable.Range(0, chunkCount).Select(i => queue.Dequeue()).ToList();
This works by generating an enumerable of integers, then dequeuing one item from your queue for each integer. Invoking ToList() ensures the operation happens immediately rather than lazily. Note that Dequeue() throws an InvalidOperationException on an empty queue, so for the final partial batch clamp the count, e.g. with Math.Min(chunkCount, queue.Count).
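A minimal sketch of that clamped version, looping over the question's 23 items (the variable names are illustrative):

using System;
using System.Collections.Generic;
using System.Linq;

var queue = new Queue<int>(Enumerable.Range(1, 23));
const int chunkCount = 10;
while (queue.Count > 0)
{
    // Clamp so the last batch of 3 doesn't over-dequeue and throw.
    var chunk = Enumerable.Range(0, Math.Min(chunkCount, queue.Count))
                          .Select(_ => queue.Dequeue())
                          .ToList();
    Console.WriteLine($"Dequeued {chunk.Count} items"); // 10, 10, 3
}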
The TPL Dataflow library offers BatchBlock<T>, which groups an input sequence of messages into chunks of the desired size.
using System.Threading.Tasks.Dataflow;

var bb = new BatchBlock<int>(10);
var ab = new ActionBlock<int[]>(chunk => HandleChunk(chunk));
bb.LinkTo(ab, new DataflowLinkOptions { PropagateCompletion = true });

for (int i = 0; i < 23; ++i)
{
    bb.Post(i);
}

bb.Complete(); // flushes the remaining 3 items as a final, smaller batch
ab.Completion.Wait();
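The snippet assumes a HandleChunk method that the answer doesn't show; a hypothetical placeholder could be:

using System;

static void HandleChunk(int[] chunk)
{
    // Process one batch; here we just report its size (10, 10, then 3).
    Console.WriteLine($"Process batch of {chunk.Length}");
}

Note that BatchBlock<T> emits any leftover items as a smaller final batch once Complete() is called, which matches the 10/10/3 breakdown from the question.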