I'm trying to figure out what data type to use... Basically I want a thread-safe FIFO queue that will automatically throw out old items once it grows to a pre-specified limit.
Well, actually, maybe more of a list, because I don't want the usual queue semantics of pushing an item on and popping it off, at which point it's no longer available.
The use case is basically a playlist: I would have up to 5 upcoming items, the currently playing item, and then about 20 items that have already played. That's why I guess it can't be a plain queue — I need to access an item in the middle as the "current" item. And I would rather not have to manually manage throwing away old items when the list gets too big... Obviously I could write this all myself, but I don't want to reinvent the wheel if something already exists for C#.
Any ideas of what I could use?
Synchronization on writes to the queue is then handled within the FIFO queue itself. As a rule, unless you have documentation in hand that declares operation Foo to be thread-safe/atomic/whatever, assume it isn't safe. Having "read somewhere" that it is safe does not count.
Using std::unique_ptr as the FIFO element type is good practice if your ITEM is large. Example usage:

FIFO<std::unique_ptr<float>, FIFOdumpTypes::DumpNewItem> fifo (5);
std::unique_ptr<float> temp = std::make_unique<float> (2.1);
fifo.push (temp);
int size = fifo.size ();
fifo.pull (temp);
Thread-Safe List With the ConcurrentQueue Class in C#: The ConcurrentQueue class is used to create a thread-safe queue data structure in C#. ConcurrentQueue works on the first-in, first-out principle, just like Queue<T>. A ConcurrentQueue object can be used instead of a List object when you need a thread-safe data structure.
It's only as atomic as you write your FIFO queue/structure to be. Wrapping a mutex or some other form of lock around the queue's operations is probably the easiest way to guarantee atomicity and safe concurrent access for the environment you describe.
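A minimal sketch of that lock-wrapped approach, using a plain Queue<T> guarded by a single lock; the class and member names here are illustrative, not from any library:

```csharp
using System.Collections.Generic;

// Sketch: a plain Queue<T> guarded by a lock, with a size cap.
public class LockedBoundedQueue<T>
{
    private readonly Queue<T> _queue = new Queue<T>();
    private readonly object _gate = new object();
    private readonly int _limit;

    public LockedBoundedQueue(int limit) { _limit = limit; }

    public void Enqueue(T item)
    {
        lock (_gate)
        {
            _queue.Enqueue(item);
            while (_queue.Count > _limit)
                _queue.Dequeue();   // throw away the oldest item
        }
    }

    // Take a consistent copy for reading (e.g. to find the "current" item).
    public T[] Snapshot()
    {
        lock (_gate) { return _queue.ToArray(); }
    }
}
```

Because every operation takes the same lock, enqueue-plus-trim is atomic; the trade-off is that all access serializes through one lock, which is usually fine for playlist-sized data.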
In the Framework there is something that almost has the functionality you want: the ConcurrentQueue<T>. It is a thread-safe queue with most operations implemented lock-free, so it is very fast.
The only feature it does not have is the "limit" with automatic "throw-away"...
But that can be easily added: create your own class containing a private ConcurrentQueue<T>, and implement the "throw-away" part in your public enqueue method by dequeuing/throwing away items until your limit is satisfied before enqueuing the new element.
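A sketch of that wrapper, under the approach described above; the class name is illustrative:

```csharp
using System.Collections.Concurrent;

// Sketch: ConcurrentQueue<T> plus a "throw-away" limit in Enqueue.
public class BoundedConcurrentQueue<T>
{
    private readonly ConcurrentQueue<T> _queue = new ConcurrentQueue<T>();
    private readonly int _limit;

    public BoundedConcurrentQueue(int limit) { _limit = limit; }

    public void Enqueue(T item)
    {
        _queue.Enqueue(item);
        // Trim the oldest entries until we are back under the limit.
        while (_queue.Count > _limit && _queue.TryDequeue(out _))
        {
        }
    }

    public T[] ToArray() => _queue.ToArray();  // snapshot for reading
}
```

Note that the count check and the dequeue are not a single atomic step, so under heavy contention the queue may briefly hold one item more or less than the limit; for a playlist that is usually acceptable.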
EDIT - as per comment:
One option would be to make the second "queue" an ObservableCollection<T> - though not inherently thread-safe (beware), it binds easily in WPF...
Another would be to make your class implement the same interfaces as ObservableCollection<T> (which are IList<T>, ICollection<T>, IEnumerable<T>, IList, ICollection, IEnumerable, INotifyCollectionChanged, INotifyPropertyChanged) accordingly - that sounds like a lot, but most of these you can implement easily by relaying to the internal ConcurrentQueue<T>, so there is not much real code to write...
See http://msdn.microsoft.com/en-us/library/ms752347.aspx
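To make the relaying idea concrete, here is a partial sketch showing just two of those interfaces (IEnumerable<T> and INotifyCollectionChanged) forwarded to an internal ConcurrentQueue<T>; the remaining interfaces would be relayed the same way, and all names here are illustrative:

```csharp
using System.Collections;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Collections.Specialized;

// Partial sketch: bounded queue that raises CollectionChanged for binding.
public class ObservableBoundedQueue<T> : IEnumerable<T>, INotifyCollectionChanged
{
    private readonly ConcurrentQueue<T> _queue = new ConcurrentQueue<T>();
    private readonly int _limit;

    public event NotifyCollectionChangedEventHandler CollectionChanged;

    public ObservableBoundedQueue(int limit) { _limit = limit; }

    public void Enqueue(T item)
    {
        _queue.Enqueue(item);
        while (_queue.Count > _limit && _queue.TryDequeue(out _)) { }
        // Reset is the simplest correct notification here, since one call
        // may both add and remove items.
        CollectionChanged?.Invoke(this,
            new NotifyCollectionChangedEventArgs(NotifyCollectionChangedAction.Reset));
    }

    public IEnumerator<T> GetEnumerator() => _queue.GetEnumerator();
    IEnumerator IEnumerable.GetEnumerator() => GetEnumerator();
}
```

Remember the caveat above: WPF binding expects collection-change events on the UI thread, so you would still need to marshal notifications appropriately.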
You might try new ReplaySubject<T>(int count) from Rx, which buffers the last count objects from the stream of observed events.
http://msdn.microsoft.com/en-us/library/hh229429.aspx
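A small sketch of that, assuming the System.Reactive NuGet package is referenced: a late subscriber receives only the buffered last count values.

```csharp
using System;
using System.Collections.Generic;
using System.Reactive.Subjects; // from the System.Reactive NuGet package

class Program
{
    static void Main()
    {
        // Buffer only the 3 most recent values.
        var subject = new ReplaySubject<int>(3);
        for (int i = 1; i <= 5; i++) subject.OnNext(i);

        var seen = new List<int>();
        subject.Subscribe(seen.Add); // late subscriber gets the buffered 3, 4, 5

        Console.WriteLine(string.Join(",", seen)); // prints "3,4,5"
    }
}
```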
If you need a more conventional programming model (Rx is a little out there), then maybe try TPL Dataflow's BroadcastBlock<T>. "Broadcast" is named as in a TV broadcast: if a frame isn't processed fast enough, it is dropped, so processing stays current with respect to the "live" frame.
http://msdn.microsoft.com/en-us/library/hh160447.aspx
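A sketch of that behavior, assuming the System.Threading.Tasks.Dataflow NuGet package is referenced; the block holds only its most recent message, so a slow reader sees the latest value rather than a backlog:

```csharp
using System;
using System.Threading;
using System.Threading.Tasks.Dataflow; // System.Threading.Tasks.Dataflow package

class Program
{
    static void Main()
    {
        // The constructor takes a clone function; identity is fine for ints.
        var block = new BroadcastBlock<int>(x => x);
        block.Post(1);
        block.Post(2);
        block.Post(3);

        Thread.Sleep(100); // give the block time to process the posts
        // Receive returns the value the block currently holds -
        // typically the most recent Post; earlier values were dropped.
        Console.WriteLine(block.Receive());
    }
}
```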
UPDATE: note that ReplaySubject is not a FIFO queue you can index into - per the Rx documentation, new ReplaySubject<T>(count) buffers the latest count values and replays them to each new subscriber, so it is a "last X buffer" rather than a general-purpose list.