I'm just curious how .NET determines the process architecture when I compile the source code under the "Any CPU" configuration setting. I always thought that if you run such a process on an x64 machine, it will be a 64-bit process. However, the example below shows something completely different.
I have a simple console program with code like this:
static void Main(string[] args)
{
    Console.WriteLine("Process Type: {0}", Environment.Is64BitProcess ? "64 Bit" : "32 Bit");
    Console.ReadLine();
}
and the configuration setting is like this:
And my processor is 64-bit.
Finally, the result shows "Process Type: 32 Bit".
Could you please give some insights?
See this Microsoft blog post, which says:
In .NET 4.5 and Visual Studio 11 the cheese has been moved. The default for most .NET projects is again AnyCPU, but there is more than one meaning to AnyCPU now. There is an additional sub-type of AnyCPU, "Any CPU 32-bit preferred", which is the new default (overall, there are now five options for the /platform C# compiler switch: x86, Itanium, x64, anycpu, and anycpu32bitpreferred). When using that flavor of AnyCPU, the semantics are the following:
- If the process runs on a 32-bit Windows system, it runs as a 32-bit process. IL is compiled to x86 machine code.
- If the process runs on a 64-bit Windows system, it runs as a 32-bit process. IL is compiled to x86 machine code.
- If the process runs on an ARM Windows system, it runs as a 32-bit process. IL is compiled to ARM machine code.
Turning "Prefer 32-bit" off will disable this behavior.
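To see the distinction directly, you can compare the process bitness with the OS bitness. This is a minimal sketch (class and property names beyond your original snippet are standard .NET APIs, not from the question): with the default "Any CPU / Prefer 32-bit" build on 64-bit Windows, it reports a 32-bit process on a 64-bit OS; unticking "Prefer 32-bit" (or targeting x64) makes the process report 64 Bit as well.

```csharp
using System;

class BitnessDemo
{
    static void Main()
    {
        // Reports the bitness of the running process (what your example printed).
        Console.WriteLine("Process: {0}",
            Environment.Is64BitProcess ? "64 Bit" : "32 Bit");

        // Reports the bitness of the underlying OS, which can differ:
        // a 32-bit process can run on a 64-bit Windows via WOW64.
        Console.WriteLine("OS:      {0}",
            Environment.Is64BitOperatingSystem ? "64 Bit" : "32 Bit");

        // IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit process,
        // so it always agrees with Environment.Is64BitProcess.
        Console.WriteLine("IntPtr.Size: {0}", IntPtr.Size);
    }
}
```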