I am writing a custom container that uses the STL list container as its internal structure. One of the member functions I am adding to this container is equal_range.
The code for my equal_range is here:
template <typename T>
typename square_list<T>::iterator_pair
square_list<T>::equal_range(key_type const& key) {
    auto range = std::equal_range(data_.begin(), data_.end(), key);
    for (square_list<T>::iterator it = range.first; it != range.second;) {
        if (range.first == data_.end() && range.second == data_.end())
            return std::make_pair(data_.end(), data_.end());
        ++it;
    }
    return range;
}
I have some unit tests running against this implementation, and everything works except when the key is not present in the data structure. In that case the returned .first and .second do not compare equal to the end of data_.
What do I need to change in order to have range.first and range.second be equal to the end of the std::list<T> data_ if the key value is not within the list?
Here's the code for the unit test as well:
BOOST_AUTO_TEST_CASE(ut_equal_range) {
    vector<int> v{ 1, 1, 2, 2, 3, 4, 5, 6, 7, 8, 8, 8 };
    square_list<char> sql(v.begin(), v.end());
    auto res = sql.equal_range(0);
    BOOST_CHECK(res.first == sql.cend());
    BOOST_CHECK(res.second == sql.cend());
}
If the key is not in data_, std::equal_range returns an empty range (first == second), positioned where the key would be inserted. So if you want to return a pair of end iterators in that case, check whether range.first == range.second and return the pair of end iterators accordingly.
It's worth noting the restrictions on the input to std::equal_range: http://en.cppreference.com/w/cpp/algorithm/equal_range
In short, the input to equal_range must be partitioned with respect to the key: all elements less than the key must come first, followed by any elements equivalent to it, followed by all elements greater. A sorted range already satisfies this guarantee.