I just observed a weird phenomenon in C#/.NET.
I created this minimal example to demonstrate:
    if (new sbyte[5] is byte[])
    {
        throw new ApplicationException("Impossible!");
    }

    object o = new sbyte[5];

    if (o is byte[])
    {
        throw new ApplicationException("Why???");
    }
This will throw "Why???", but not "Impossible!". It works for all arrays of integral types of the same size. Can someone explain this to me? I'm confused. I'm using .NET 4 by the way.
P.S.: I know that I can get the expected result by using o.GetType() == typeof(byte[]).
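For anyone else who lands here, a minimal sketch of that workaround: GetType() compares the exact runtime type, whereas is follows the CLR's permissive array-casting rules. (This is my own illustrative example, not code from the question.)

```csharp
using System;

class Program
{
    static void Main()
    {
        object o = new sbyte[5];

        // "is" consults the CLR, which treats same-size integral arrays
        // as compatible, so this prints True:
        Console.WriteLine(o is byte[]);

        // GetType() compares the exact runtime type, so only the
        // sbyte[] comparison succeeds:
        Console.WriteLine(o.GetType() == typeof(byte[]));  // False
        Console.WriteLine(o.GetType() == typeof(sbyte[])); // True
    }
}
```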
The CLR rules of casting specify that this is possible. The C# rules say it is not possible. The C# team consciously decided that they would tolerate this deviation from the spec for various reasons.
Why does the CLR allow this? Probably because it is convenient to implement: byte and sbyte have the same binary representation, so the runtime can treat a byte[] as an sbyte[] without violating memory safety. The same trick works for other primitive types that share a memory layout.
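As a quick check of that claim (my own example, not from the original answer), here are a couple of other same-size pairs, routed through object so the compiler cannot constant-fold the test:

```csharp
using System;

class SameLayoutDemo
{
    static void Main()
    {
        // int and uint are both 32-bit, so the CLR allows the cast:
        object ints = new int[] { 1, 2, 3 };
        Console.WriteLine(ints is uint[]);   // True

        // long and ulong are both 64-bit:
        object longs = new long[] { 1L };
        Console.WriteLine(longs is ulong[]); // True

        // int (32-bit) and long (64-bit) differ in size, so this fails:
        Console.WriteLine(ints is long[]);   // False
    }
}
```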
Funny, I got bitten by that in my question, Why does this Linq Cast Fail when using ToList?
Jon Skeet (of course) explains that my problem was the C# compiler: it decides the two types could never be the same and "helpfully" optimizes the check to false. The CLR, however, does allow the conversion. Casting to object defeats the compiler optimization, so the check actually goes through the CLR.
The relevant part from his answer:
Even though in C# you can't cast a byte[] to an sbyte[] directly, the CLR allows it:
    var foo = new byte[] { 246, 127 };

    // This produces a warning at compile-time, and the C# compiler "optimizes"
    // to the constant "false"
    Console.WriteLine(foo is sbyte[]);

    object x = foo;

    // Using object fools the C# compiler into really consulting the CLR... which
    // allows the conversion, so this prints True
    Console.WriteLine(x is sbyte[]);
Joel asked an interesting question in the comments: "Is this behavior controlled by the Optimize Code flag (/o to the compiler)?"
Given this code:
    static void Main(string[] args)
    {
        sbyte[] baz = new sbyte[0];
        Console.WriteLine(baz is byte[]);
    }
Compiled with csc /o- Code.cs (don't optimize), the compiler appears to optimize the check anyway. The resulting IL:
    IL_0000: nop
    IL_0001: ldc.i4.0
    IL_0002: newarr [mscorlib]System.SByte
    IL_0007: stloc.0
    IL_0008: ldc.i4.0
    IL_0009: call void [mscorlib]System.Console::WriteLine(bool)
    IL_000e: nop
    IL_000f: ret
IL_0008 loads 0 (false) directly onto the stack, and IL_0009 then calls WriteLine. So no, the optimization flag does not make a difference. If the CLR were consulted, the isinst instruction would be used; starting from IL_0008, the code would probably look something like this:
    IL_0008: ldloc.0
    IL_0009: isinst uint8[]
    IL_000e: ldnull
    IL_000f: cgt.un
    IL_0011: call void [mscorlib]System.Console::WriteLine(bool)
I would agree with the optimizer's behavior. The optimization flag should not change the behavior of your program.