Given the following code:
```cpp
#include <iterator>
#include <vector>

int main()
{
    char arr[3] = { 1, 2, 3 };
    std::vector<char> vec = { 1, 2, 3 };
    std::vector<int> vec_one(std::begin(arr), std::end(arr));
    std::vector<int> vec_two(vec.begin(), vec.end());
}
```
Are the initializations of `vec_one` and `vec_two` undefined, implementation-defined, or defined according to the normal type conversion rules? What if the `char` and `int` types are swapped?
They are all fine, subject to the same rules that apply when converting a `char` to an `int` (so no concerns there) and an `int` to a `char`, which is again governed by the normal conversion rules: if `char` is signed, the `int` value must fit into `char`, otherwise the result is implementation-defined before C++20 (and wraps modulo 2^N from C++20 onward); if `char` is unsigned, the conversion has well-defined wrap-around behaviour.
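For illustration, here is a minimal sketch (assuming the usual 8-bit `char`) that narrows an out-of-range `int` into a `char` through the same iterator-range constructor:

```cpp
#include <iostream>
#include <vector>

int main()
{
    std::vector<int> big = { 1, 2, 300 };                // 300 does not fit in an 8-bit char
    std::vector<char> narrowed(big.begin(), big.end());  // element-wise int -> char conversion

    // If char is unsigned, 300 wraps to 300 % 256 == 44.
    // If char is signed, the result is implementation-defined before C++20
    // and also 44 (modulo 256) from C++20 onward.
    std::cout << static_cast<int>(narrowed[2]) << '\n';
}
```

On a typical platform this prints 44 either way; the point is that the result comes from the conversion rules, not from undefined behaviour.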
This is well-defined code in all but one corner case. `int` is required to be at least as large as `char` and to be able to store at least what a 16-bit two's complement integer can store. So when `sizeof(char) < sizeof(int)` the behaviour is well defined, because `int` can store every value `char` can. If `sizeof(char) == sizeof(int)` and `char` is unsigned, some `char` values can exceed what `int` can represent; converting such a value to `int` is then implementation-defined before C++20 (not undefined behaviour) and wraps modulo 2^N from C++20 onward.
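If you want to make that assumption explicit, a hedged compile-time check along these lines would catch the exotic case at build time (the condition simply mirrors the reasoning above):

```cpp
#include <type_traits>

// char -> int is value-preserving when int is strictly wider than char,
// or when they have the same size but char is signed. On mainstream
// platforms (8-bit char, 32-bit int) this assertion always passes.
static_assert(sizeof(char) < sizeof(int) || std::is_signed<char>::value,
              "some char values may not be representable as int");

int main() {}
```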
The reverse direction has the same kind of corner case. If `char` is signed and `sizeof(int) > sizeof(char)`, an `int` value can be out of range for `char`; the conversion is then implementation-defined before C++20 (it is a conversion, not signed integer overflow, so it is not undefined behaviour) and wraps modulo 2^N from C++20 onward. If `char` is unsigned, the conversion always has well-defined wrap-around behaviour.
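If you need the narrowing to be fully portable regardless of `char`'s signedness, one option is to range-check before converting. The helper below is only a sketch of that idea (`to_chars_checked` is an illustrative name, not a standard facility):

```cpp
#include <algorithm>
#include <limits>
#include <stdexcept>
#include <vector>

// Convert int -> char only after verifying every value is representable,
// so the result never depends on implementation-defined conversion behaviour.
std::vector<char> to_chars_checked(const std::vector<int>& src)
{
    const bool all_fit = std::all_of(src.begin(), src.end(), [](int v) {
        return v >= std::numeric_limits<char>::min()
            && v <= std::numeric_limits<char>::max();
    });
    if (!all_fit)
        throw std::out_of_range("value does not fit in char");
    return std::vector<char>(src.begin(), src.end());
}

int main()
{
    std::vector<int> vec_one = { 1, 2, 3 };
    std::vector<char> narrowed = to_chars_checked(vec_one);  // fine: every value fits
}
```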