I have a vector of pairs of integers that looks something like this:
(0, 1)
(1, 9)
(2, 3)
(6, 1)
(4, 0)
I want to extract the unique elements from it, so that the result looks as follows: 0, 1, 9, 2, 3, 6, 4
(basically just all numbers without duplicates)
At the moment I'm doing it like this:
std::vector<int> getElements(std::vector<std::pair<int, int>> S) {
    std::vector<int> V;
    for (std::vector<std::pair<int, int>>::iterator i = S.begin(); i != S.end(); i++) {
        if (std::find(V.begin(), V.end(), i->first) == V.end()) {
            V.push_back(i->first);
        }
        if (std::find(V.begin(), V.end(), i->second) == V.end()) {
            V.push_back(i->second);
        }
    }
    return V;
}
Is there any more efficient way to do it?
Your current solution is O(n²): each std::find is a linear scan over V. You can reduce that check for already-seen elements to amortized O(1) by using std::unordered_set to store the numbers seen so far; this improves the overall runtime to O(n).
Here is an improved algorithm:
#include <algorithm>
#include <unordered_set>
#include <utility>
#include <vector>

std::vector<int> getElements(std::vector<std::pair<int, int>> S) {
    std::unordered_set<int> ss;
    std::for_each(S.begin(), S.end(), [&ss](const auto& p) {
        ss.insert(p.first);
        ss.insert(p.second);
    });
    return std::vector<int>(ss.begin(), ss.end());
}
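One caveat: std::unordered_set does not preserve insertion order, so the output will generally not match the 0, 1, 9, 2, 3, 6, 4 ordering shown in the question. If first-seen order matters, here is a sketch that uses the set only for membership testing (the name getElementsOrdered is mine):

```cpp
#include <unordered_set>
#include <utility>
#include <vector>

// Order-preserving variant: the unordered_set tracks which numbers have
// already been seen, while the vector records them in first-seen order.
std::vector<int> getElementsOrdered(const std::vector<std::pair<int, int>>& S) {
    std::unordered_set<int> seen;
    std::vector<int> V;
    for (const auto& p : S) {
        // insert() returns {iterator, bool}; the bool is true only when the
        // value was not already present in the set.
        if (seen.insert(p.first).second)  V.push_back(p.first);
        if (seen.insert(p.second).second) V.push_back(p.second);
    }
    return V;
}
```

With the question's input this returns {0, 1, 9, 2, 3, 6, 4}, still in O(n) overall.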
Is there any more efficient way to do it?

Yes, there is. std::find has O(n) complexity on a vector, so repeating it for each element gives the whole function O(n²) complexity. A simple alternative is to add every element to a std::set; building the set costs O(n log n).
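A minimal sketch of the std::set approach (the function name is mine); note that a std::set keeps its elements sorted, so the result comes out in ascending order rather than the first-seen order shown in the question:

```cpp
#include <set>
#include <utility>
#include <vector>

// Collect all pair members into a std::set, which discards duplicates
// automatically, then copy the set into the result vector.
std::vector<int> getElementsSorted(const std::vector<std::pair<int, int>>& S) {
    std::set<int> s;
    for (const auto& p : S) {
        s.insert(p.first);
        s.insert(p.second);
    }
    // std::set iterates in ascending order, so the vector is sorted.
    return std::vector<int>(s.begin(), s.end());
}
```

For the question's input this yields {0, 1, 2, 3, 4, 6, 9}.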