All I need is to know whether something exists and how many times it exists. I will iterate over the existing things and query how many of each there are.
My implementation so far uses a multiset, and I do the following:
std::multiset<thing> a;

auto previous = a.end();
for( auto each = a.begin(); each != a.end(); ++each ) {
    if( previous == a.end() || *previous != *each ) {
        // First occurrence of this value: here is where I would
        // "do something" with the number of occurrences.
        a.count(*each);
    }
    previous = each;
}
I have a vector of things, but the values sometimes repeat. I want to iterate over the unique things and, for each unique one, do something. This "something" needs to know how many times that thing appears in the vector.
The code I posted above is how I am solving the problem right now, but it does not seem to be the most elegant way to do what I want.
I am just following the Stack Overflow guidelines: I state my problem, and I show my attempted solution.
If a sentence with a question mark is really needed, here you go: is there a way to iterate over the unique elements of a multiset?
Three possible approaches:

1. Use std::unique to create a temporary collection of unique values. This might make the code a little more readable, but it is less efficient (see the first sketch after this list).

2. Use std::multiset::upper_bound rather than plain increments:

   for( auto each = a.begin(); each != a.end(); each = a.upper_bound(*each) )

   That way you don't need the if check inside your loop, plus each jump is guaranteed to be logarithmic in the size of the container. Pretty cool (I didn't know that before I looked it up). For the following refinement, all credit goes to @MarkRansom: with std::upper_bound from <algorithm> you can specify the range in which to search for the upper bound. In your case you already have a good candidate for the start of that range, so this method is likely to be more efficient, depending on the implementation in your standard library (see the second sketch below).

3. Use a map<thing, unsigned> or even an unordered_map<thing, unsigned>, where the unsigned just keeps track of the number of equivalent things you have. That implies rewriting your insertion/deletion code, though (see the third sketch below).
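A minimal sketch of the std::unique approach, assuming thing is a plain int for brevity (the sample values and the printing are placeholders for your real type and your "do something"):

#include <algorithm>
#include <iostream>
#include <set>
#include <vector>

int main() {
    std::multiset<int> a{ 1, 1, 2, 3, 3, 3 };

    // The multiset is already sorted, so copying it out and dropping
    // adjacent duplicates leaves exactly one entry per distinct value.
    std::vector<int> uniques( a.begin(), a.end() );
    uniques.erase( std::unique( uniques.begin(), uniques.end() ), uniques.end() );

    for( const auto& value : uniques ) {
        std::cout << value << " appears " << a.count(value) << " times\n";
    }
}

The extra vector is the price of the readability: you pay one full copy of the container up front.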
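A sketch of the upper_bound idea under the same assumptions (thing as int, printing standing in for the real work). The first loop uses the member function; the second shows @MarkRansom's variant with std::upper_bound from <algorithm>, where the search range starts at the current position instead of covering the whole container:

#include <algorithm>
#include <iostream>
#include <set>

int main() {
    std::multiset<int> a{ 1, 1, 2, 3, 3, 3 };

    // Member upper_bound(): jump straight to the first element greater
    // than the current value, so each distinct value is visited once.
    for( auto each = a.begin(); each != a.end(); each = a.upper_bound(*each) ) {
        std::cout << *each << " appears " << a.count(*each) << " times\n";
    }

    // Free-standing std::upper_bound: same effect, but the search range
    // can begin at the current iterator rather than at a.begin().
    for( auto each = a.begin(); each != a.end();
         each = std::upper_bound( each, a.end(), *each ) ) {
        std::cout << *each << " appears " << a.count(*each) << " times\n";
    }
}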
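A sketch of the counting-map alternative, again with int standing in for thing; the name counts and the sample values are made up for the example:

#include <iostream>
#include <map>

int main() {
    std::map<int, unsigned> counts;   // or std::unordered_map<int, unsigned>

    // Insertion side: operator[] creates the entry with a zero count the
    // first time a value is seen, so a plain increment is enough.
    for( int value : { 1, 1, 2, 3, 3, 3 } ) {
        ++counts[value];
    }

    // Every key is unique by construction and the mapped value is the count,
    // so no de-duplication step is needed at all.
    for( const auto& entry : counts ) {
        std::cout << entry.first << " appears " << entry.second << " times\n";
    }
}

The trade-off is the one mentioned above: insertion and deletion have to be rewritten around the map instead of the multiset.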