Is there a quick, built-in way in C# to convert an array of three bytes representing a 24-bit (little-endian, two's complement) value to an int? How should I go about this?
Thanks!
Endianness doesn't matter for C-style strings, since they are just sequences of bytes. Endianness does matter when you use a type cast that depends on a particular byte order being in use.
On little-endian platforms, the value 1 is stored in one byte as 01 (the same as big-endian), in two bytes as 01 00, and in four bytes as 01 00 00 00. If an integer is negative, the two's complement representation is used, and the high-order bit of the most significant byte will be set.
If the machine is little-endian, the value 1 is stored as "01 00 00 00". The program checks the first byte by dereferencing the cptr pointer: if it equals 0, the processor is big-endian ("00 00 00 01"); if it equals 1, the processor is little-endian ("01 00 00 00").
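For what it's worth, the same check can be done in C# without pointer tricks; this is a minimal sketch using BitConverter (the variable names are mine):

using System;

// BitConverter.GetBytes(1) returns { 01, 00, 00, 00 } on a little-endian host
// and { 00, 00, 00, 01 } on a big-endian one, mirroring the pointer check
// described above. BitConverter.IsLittleEndian reports the same thing directly.
byte firstByte = BitConverter.GetBytes(1)[0];
bool isLittleEndian = firstByte == 1;
Console.WriteLine(isLittleEndian ? "little-endian" : "big-endian");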
I'm surprised no one has suggested BitConverter yet. Assuming you have the three bytes in separate variables, with byte0 being the least significant:
var data = new byte[]
{
    byte0, byte1, byte2,
    (byte)((byte2 & 0x80) == 0 ? 0x00 : 0xFF)  // sign-extend from bit 23
};
return BitConverter.ToInt32(data, 0);
or alternatively:
var data = new byte[] { 0x00, byte0, byte1, byte2 };
return BitConverter.ToInt32(data, 0) >> 8;  // arithmetic right shift sign-extends
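Both variants assume BitConverter reads the buffer in little-endian order, which is true on virtually every .NET platform but is really the host's byte order. A rough sketch of a complete helper with that guard (the method name ToInt24 and the guard are my additions):

using System;

static int ToInt24(byte byte0, byte byte1, byte byte2)
{
    // byte0 = least significant, byte2 = most significant (little-endian input)
    var data = new byte[]
    {
        byte0, byte1, byte2,
        (byte)((byte2 & 0x80) == 0 ? 0x00 : 0xFF)  // sign-extension byte
    };
    if (!BitConverter.IsLittleEndian)
        Array.Reverse(data);                       // BitConverter uses host byte order
    return BitConverter.ToInt32(data, 0);
}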
Alternatively, without building an intermediate array at all, plain shifts work; the arithmetic right shift at the end handles the sign extension:
int converted = ((bytes[2] << 24) | (bytes[1] << 16) | (bytes[0] << 8)) >> 8;
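As a quick sanity check of that expression with sample little-endian bytes (the test values here are just illustrations):

using System;

byte[] bytes = { 0xFF, 0xFF, 0xFF };   // 24-bit two's complement for -1, bytes[0] = LSB
int converted = ((bytes[2] << 24) | (bytes[1] << 16) | (bytes[0] << 8)) >> 8;
Console.WriteLine(converted);          // -1

bytes = new byte[] { 0xFF, 0xFF, 0x7F };   // largest positive 24-bit value
converted = ((bytes[2] << 24) | (bytes[1] << 16) | (bytes[0] << 8)) >> 8;
Console.WriteLine(converted);          // 8388607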