I have a piece of code that outputs different results depending on the C# compiler and the runtime.
The code in question is:
using System;

public class Program {
    public static void Main() {
        Console.WriteLine(string.Compare("alo\0alo\0", "alo\0alo\0\0", false, System.Globalization.CultureInfo.InvariantCulture));
    }
}
The results are:
                      Compiling with Mono (gmcs)   Compiling with .NET (csc)
Running with Mono     -1                           -1
Running with .NET     -1                            0
How can the same binary output different values when running on the .NET Framework?
(BTW, according to http://msdn.microsoft.com/en-us/library/system.string.aspx the output should be 0, so Mono's answer is incorrect, but that's unrelated to my question.)
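For what it's worth, the discrepancy only shows up with culture-sensitive comparison; an ordinal comparison treats the embedded and trailing nulls as ordinary UTF-16 code units and behaves the same on every runtime. A minimal sketch (the result of the culture-sensitive call is deliberately not asserted, since that result is exactly what varies here):

```csharp
using System;
using System.Globalization;

public class CompareDemo {
    public static void Main() {
        string a = "alo\0alo\0";
        string b = "alo\0alo\0\0";

        // Ordinal comparison: code-unit by code-unit. Since a is a proper
        // prefix of b, the result is negative on every runtime.
        Console.WriteLine(string.CompareOrdinal(a, b));

        // Culture-sensitive comparison: the invariant culture may treat the
        // trailing NUL as an ignorable character, which is presumably why
        // some runtimes report 0 here while others report -1.
        Console.WriteLine(string.Compare(a, b, false, CultureInfo.InvariantCulture));
    }
}
```

If deterministic handling of strings with embedded nulls ever matters, `string.CompareOrdinal` (or `StringComparison.Ordinal`) is the safe choice.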
Even the generated IL code is (almost) the same.
Compiling with .Net:
.method public hidebysig static void Main() cil managed
{
.entrypoint
// Code size 29 (0x1d)
.maxstack 8
IL_0000: nop
IL_0001: ldstr bytearray (61 00 6C 00 6F 00 00 00 61 00 6C 00 6F 00 00 00 ) // a.l.o...a.l.o...
IL_0006: ldstr bytearray (61 00 6C 00 6F 00 00 00 61 00 6C 00 6F 00 00 00
                          00 00 ) // a.l.o...a.l.o....
IL_000b: ldc.i4.0
IL_000c: call class [mscorlib]System.Globalization.CultureInfo [mscorlib]System.Globalization.CultureInfo::get_InvariantCulture()
IL_0011: call int32 [mscorlib]System.String::Compare(string,
string,
bool,
class [mscorlib]System.Globalization.CultureInfo)
IL_0016: call void [mscorlib]System.Console::WriteLine(int32)
IL_001b: nop
IL_001c: ret
} // end of method Program::Main
Compiling with mono:
.method public hidebysig static void Main() cil managed
{
.entrypoint
// Code size 27 (0x1b)
.maxstack 8
IL_0000: ldstr bytearray (61 00 6C 00 6F 00 00 00 61 00 6C 00 6F 00 00 00 ) // a.l.o...a.l.o...
IL_0005: ldstr bytearray (61 00 6C 00 6F 00 00 00 61 00 6C 00 6F 00 00 00
                          00 00 ) // a.l.o...a.l.o....
IL_000a: ldc.i4.0
IL_000b: call class [mscorlib]System.Globalization.CultureInfo [mscorlib]System.Globalization.CultureInfo::get_InvariantCulture()
IL_0010: call int32 [mscorlib]System.String::Compare(string,
string,
bool,
class [mscorlib]System.Globalization.CultureInfo)
IL_0015: call void [mscorlib]System.Console::WriteLine(int32)
IL_001a: ret
} // end of method Program::Main
The only difference is the two extra nop instructions in the .NET version.
How is it possible? How can the two output values be different?
Also, if anyone has both .Net and mono installed, can you reproduce it?
EDIT: I don't care what the correct result is, and I don't care that Mono and .NET produce different results. I'll probably never encounter embedded nulls AND sort them AND have the sorting order matter.
My problem is that the same runtime (.NET 2.0) produces different results when the code is compiled by different compilers.
EDIT 2: I added a table and tried to clarify the question, it should be easier to understand now.
My guess is that when you compile it with Mono, it references the .NET 2.0 version of mscorlib, whereas when you compile it with VS, it targets .NET 4.0.
I may be incorrect about which exact version is being targeted in each case, but that's where I'd look to start with: don't look at the IL for the method, look at the referenced assemblies.
(It may help if you'd say which versions of VS, .NET and Mono you've got installed, btw.)
EDIT: Okay, so if it does the same thing regardless of what version you target, how about running a diff on the results of running ildasm on each version? Compare the whole files, not just the IL for the method call itself.
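One quick way to act on that suggestion, besides diffing the ildasm output, is to have the program itself report which CLR and which mscorlib it actually loaded at run time; a sketch:

```csharp
using System;

public class RuntimeCheck {
    public static void Main() {
        // The version of the CLR actually executing this assembly.
        Console.WriteLine(Environment.Version);

        // The core library that got loaded; its full name and on-disk
        // location reveal whether the 2.0 or 4.0 framework is in use.
        var coreLib = typeof(object).Assembly;
        Console.WriteLine(coreLib.FullName);
        Console.WriteLine(coreLib.Location);
    }
}
```

Running this from both the gmcs-built and the csc-built executables would confirm (or rule out) the theory that they end up on different framework versions.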