I'm not used to C++, so bear with me...
Two bytes are read from a device and end up in a buffer. The buffer contents are then to be printed.
The code below is supposed to produce the string "0x204D". However, it returns "0x M", which in hex is 30 78 20 4d.
So the bytes are not being converted to their hexadecimal text representation.
void vito_unit::decodeAsRaw(unsigned char *buffer, int bufferLen)
{
    std::stringstream *decodedClearText;
    decodedClearText = new std::stringstream;
    *decodedClearText << "0x" << std::hex;
    for (int i = 0; i < bufferLen; i++) {
        *decodedClearText << buffer[i];
    }
    setValue(decodedClearText->str());
}
How should it be done?
This has nothing to do with std::hex.
When you stream a [signed/unsigned] char, its ASCII representation is used, because that is usually what is expected of chars.
You can stream a number instead by converting it to int. Then the feature that renders numbers in hexadecimal notation (i.e. std::hex) will be triggered.
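For example, here is a minimal standalone sketch (the main function and variable names are just for illustration) contrasting the two behaviors:

#include <iostream>
#include <sstream>

int main()
{
    unsigned char byte = 0x4D;

    std::stringstream asChar;
    asChar << std::hex << byte;                      // streams the character 'M'; std::hex has no effect
    std::cout << asChar.str() << "\n";               // prints: M

    std::stringstream asNumber;
    asNumber << std::hex << static_cast<int>(byte);  // streams the number 77 in hex
    std::cout << asNumber.str() << "\n";             // prints: 4d
}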
You should also fix that memory leak and unnecessary dynamic allocation:
void vito_unit::decodeAsRaw(unsigned char const* const buffer, int const bufferLen)
{
    std::stringstream decodedClearText;
    decodedClearText << "0x" << std::hex;
    for (int i = 0; i < bufferLen; i++) {
        decodedClearText << +buffer[i];
    }
    setValue(decodedClearText.str());
}
The unary "+" performs an integral promotion to int.
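Note that this still produces "0x204d" rather than "0x204D", since std::hex uses lowercase digits by default, and a byte below 0x10 would yield only one digit. If you want the exact "0x204D" form, you can add std::uppercase and pad each byte with std::setw/std::setfill. A sketch, assuming two uppercase hex digits per byte is the desired format (toHex is a hypothetical helper name, not part of the original class):

#include <iomanip>
#include <sstream>
#include <string>

std::string toHex(unsigned char const* buffer, int bufferLen)
{
    std::stringstream out;
    // std::hex, std::uppercase and std::setfill are sticky and apply to all later insertions
    out << "0x" << std::hex << std::uppercase << std::setfill('0');
    for (int i = 0; i < bufferLen; i++) {
        out << std::setw(2) << +buffer[i];  // std::setw applies to one insertion only, so repeat it per byte
    }
    return out.str();
}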