This is mostly a language-lawyer kind of question; I doubt that most implementations would bother, especially since it would probably increase compile time for every user.
That being said: if some implementation of std::set were implemented using a bitset for each instance and a static array of 256 values that is shared (which is safe, since keys are const), would that be legal according to the standard (assume C++20 if the edition matters)?
I see no constraint that would forbid you from making a specialized implementation, as long as you respect the standard's specifications in section [set].
For set<char> or set<uint8_t> you'd need 32 octets to store the 256 bits representing the potential members, with the advantage of very fast set operations. For set<int> you'd consume far too much memory, and this would IMHO only be justified if you had very densely populated sets.
This being said, there are some challenges to overcome, for example the extract() member, which is supposed to return a value of the (unspecified) specialized type node_type. I'm not sure what this requirement implies, but I think it could be solved in a similar manner to the iterator issue above. EDIT: I made a mistake; proxy iterators are not allowed in C++17.
I think it is not possible. First: that implementation would satisfy all complexity guarantees, and for most of them would be better than the requirements. Second: you are not allowed to create proxy iterators for a standard container, since iterators have to return real references at some point. (The std::bitset mentioned by @Christophe is not a container and has a proxy reference in its definition; std::vector<bool> is a famous example of breaking these guarantees.) So it is not possible to use that implementation.
Edit: Thanks to @NicolBolas for pointing out that proxy iterators are still not allowed.