Is C# ever endian-sensitive? For example, will code such as this:
int a = 1234567;
short b = *(short*)&a;
always assign the same value to b? If so, what value will it be?
If not, what good ways are there to deal with endianness in code that uses pointers?
C# doesn't define the endianness. In practice, yes, it will probably always be little-endian (IIRC even on IA64, but I haven't checked), but you should ideally check BitConverter.IsLittleEndian if endianness is important - or just use bit-shifting etc. rather than direct memory access.
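As an illustration, here's a minimal sketch of the bit-shifting approach - reading a 32-bit little-endian value from a byte array without any pointer tricks (the method name, buffer, and offset are just for illustration):

// Assembles the value byte by byte, so the host's byte order never matters.
static int ReadInt32LittleEndian(byte[] buffer, int offset)
{
    return buffer[offset]
         | (buffer[offset + 1] << 8)
         | (buffer[offset + 2] << 16)
         | (buffer[offset + 3] << 24);
}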
To quote a few lines from protobuf-net (a build not yet committed):
WriteInt64(*(long*)&value);
if (!BitConverter.IsLittleEndian)
{ // not fully tested, but this *should* work
    Reverse(ioBuffer, ioIndex - 8, 8);
}
i.e. it checks the endianness and does a flip if necessary.
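A self-contained sketch of the same flip-if-necessary pattern (Reverse above is protobuf-net's own helper; here Array.Reverse stands in for it, and the method name is just for illustration):

// Writes value to buffer in little-endian order on any host.
static void WriteInt64LittleEndian(byte[] buffer, int offset, long value)
{
    byte[] bytes = BitConverter.GetBytes(value); // host byte order
    if (!BitConverter.IsLittleEndian)
    {
        Array.Reverse(bytes); // flip on big-endian hosts
    }
    Array.Copy(bytes, 0, buffer, offset, bytes.Length);
}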
Yes, I believe that code is endian-sensitive. The value of b will be the least-significant bytes on a little-endian processor, and the most-significant bytes on a big-endian processor. To make this simpler to see, let's switch to hex:
using System;
class Test
{
unsafe static void Main()
{
int a = 0x12345678;
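// Which two bytes of 0x12345678 this reads depends on the platform's endianness.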
short b = *(short*)&a;
Console.WriteLine(b.ToString("x"));
}
}
On my x86 box, that prints "5678", showing that the least-significant bytes were at the "start" of the value of a. If you run the same code on a processor running in big-endian mode (probably under Mono) I'd expect it to print "1234".
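Incidentally, if what you actually want is the low 16 bits regardless of platform, a plain cast gives you that portably, since C# defines integer conversions in terms of values rather than memory layout - a minimal sketch:

int a = 0x12345678;
short lo = (short)a;          // low 16 bits on any endianness: 0x5678
short hi = (short)(a >> 16);  // high 16 bits on any endianness: 0x1234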