I've noticed in VS 2010 the default platform target for a C# project is x86 (it used to be Any CPU), and was wondering why the change.
Does the compiler perform any optimizations based on fixing the platform to x86 vs x64 vs Any CPU?
Should I be forcing my apps to one platform or the other for performance reasons?
The previous version of Visual Studio used to default this to "Any CPU", which means that on an x86 machine you would always run as x86, whereas on an x64 machine you would run as either x64 or x86 depending on whether the process the assembly was being loaded into was 32-bit or 64-bit.
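To see how an "Any CPU" assembly actually got resolved at run time, a small probe like the following works (a sketch; the class name is arbitrary, and the `Environment` properties shown require .NET 4, the framework that ships with VS 2010):

```csharp
using System;

class PlatformProbe
{
    static void Main()
    {
        // IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit process,
        // so an Any CPU assembly can observe which way it was loaded.
        Console.WriteLine("PointerSize=" + IntPtr.Size);

        // Available since .NET 4:
        Console.WriteLine("Is64BitProcess=" + Environment.Is64BitProcess);
        Console.WriteLine("Is64BitOS=" + Environment.Is64BitOperatingSystem);
    }
}
```

Run the same binary on a 32-bit and a 64-bit machine and the output differs, which is exactly the behavior described above.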
The trouble is that when starting a new process, a .NET exe built with the "Any CPU" option will end up as a 64-bit process (on a 64-bit OS) rather than as a 32-bit process, which can cause problems for 2 reasons:

- A 64-bit process cannot load 32-bit native DLLs or in-process COM components, so interop with legacy 32-bit code fails.
- At the time, the debugging experience for 64-bit processes was inferior; in particular, Edit and Continue was not supported when debugging 64-bit code.
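One concrete problem is that a 64-bit process cannot load a 32-bit native DLL. A minimal sketch, assuming a hypothetical 32-bit-only native library named `legacy32.dll` (the name and the `DoWork` entry point are purely illustrative):

```csharp
using System;
using System.Runtime.InteropServices;

class InteropDemo
{
    // "legacy32.dll" is a hypothetical 32-bit-only native library,
    // named here for illustration only.
    [DllImport("legacy32.dll")]
    static extern int DoWork();

    static void Main()
    {
        Console.WriteLine("Is64BitProcess=" + Environment.Is64BitProcess);
        try
        {
            DoWork();
        }
        catch (BadImageFormatException)
        {
            // Raised when a process tries to load a native DLL
            // of the wrong bitness (e.g. 64-bit EXE, 32-bit DLL).
            Console.WriteLine("Bitness mismatch: cannot load the native DLL");
        }
        catch (DllNotFoundException)
        {
            Console.WriteLine("legacy32.dll is not present on this machine");
        }
    }
}
```

Built as x86, the same code loads the 32-bit DLL without complaint; built as Any CPU and run on a 64-bit OS, the load fails with `BadImageFormatException`.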
As so few applications actually use enough address space (i.e. memory) to make the hassle worthwhile, the default was changed to x86 in order to avoid these problems.
You'll notice that libraries still default to Any CPU, and libraries should always be Any CPU. So if there were platform-specific optimizations, they would only apply to EXEs, which doesn't make sense. No, the issue is that Any CPU executables are usually more hassle than they are worth except in the hands of those who know what they are doing. And for those who know what they want, the defaults aren't a serious problem.
I'll add that I didn't agree with this policy initially, but since the x86 debugging experience is superior, I've come to accept it for what it is: the default.