I understand that we can compile a .NET application by targeting AnyCPU, which causes it to run as a 32-bit process on a 32-bit OS and as a 64-bit process on a 64-bit OS.
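As a side note, you can verify at runtime which mode an AnyCPU build actually picked. This is a minimal, illustrative check (Environment.Is64BitProcess requires .NET 4.0 or later; the class name is just for the example):

    using System;

    class BitnessCheck
    {
        static void Main()
        {
            // IntPtr.Size is 4 in a 32-bit process and 8 in a 64-bit one.
            Console.WriteLine($"IntPtr.Size: {IntPtr.Size} bytes");
            // Available since .NET 4.0.
            Console.WriteLine($"64-bit process: {Environment.Is64BitProcess}");
        }
    }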
However, there was a reported bug* where my app threw an error on a 64-bit OS, and the solution was that I needed to target x86.
Now my question: is it really bad to target x86 even though your code is going to run on x64? What sort of performance difference are we talking about? (My application is quite CPU-intensive, but it's really hard to come up with concrete numbers.)
After all, the .NET Framework will then run in 32-bit mode, which sounds bad to me compared to taking advantage of the full addressing power of an x64 CPU.**
* I can't remember the exact bug, but targeting x86 specifically solved the problem.
** I'm not sure if it matters, but my application doesn't use any Int64 variables.
To put it simply: if you run a 32-bit program on a 64-bit machine, it will work fine, and you won't encounter any problems. Backward compatibility is an important part of computer technology, which is why 64-bit systems can support and run 32-bit applications.
Can you run 32-bit programs on a 64-bit computer? Most programs made for the 32-bit version of Windows will work on the 64-bit version of Windows; notable exceptions are many antivirus programs. Device drivers made for the 32-bit version of Windows, however, will not work on a computer running a 64-bit version of Windows.
Short answer: yes. In general, a 32-bit program runs slightly faster than the equivalent 64-bit program on a 64-bit platform, given the same CPU. The main reason is pointer size: every reference is 4 bytes instead of 8, so pointer-heavy data has a smaller footprint and makes better use of the CPU caches.
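A rough way to see the footprint side of this is to compile the same snippet as x86 and then as x64 and compare the output. This is an illustrative sketch, not a rigorous benchmark, and the exact numbers depend on the runtime:

    using System;

    class PointerFootprint
    {
        static void Main()
        {
            const int count = 1000000;
            long before = GC.GetTotalMemory(true);

            // Each slot of an object[] holds one reference (IntPtr.Size bytes),
            // so the same array is roughly twice as large in a 64-bit process.
            object[] refs = new object[count];

            long after = GC.GetTotalMemory(true);
            Console.WriteLine($"Pointer size: {IntPtr.Size} bytes");
            Console.WriteLine($"Array of {count} references: ~{after - before} bytes");
            GC.KeepAlive(refs);
        }
    }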
The 64-bit versions of Windows use the Microsoft Windows-32-on-Windows-64 (WOW64) subsystem to run 32-bit programs without modifications. The 64-bit versions of Windows don't provide support for 16-bit binaries or 32-bit drivers.
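If you need to detect WOW64 explicitly, one option is the Win32 IsWow64Process API; here is a minimal P/Invoke sketch (on .NET 4.0 and later you can get the same answer from Environment.Is64BitOperatingSystem && !Environment.Is64BitProcess without any interop):

    using System;
    using System.Diagnostics;
    using System.Runtime.InteropServices;

    class Wow64Check
    {
        // Exported by kernel32.dll on Windows XP SP2 and later.
        [DllImport("kernel32.dll", SetLastError = true)]
        static extern bool IsWow64Process(IntPtr hProcess, out bool wow64Process);

        static void Main()
        {
            // True only for a 32-bit process running on 64-bit Windows.
            bool isWow64;
            IsWow64Process(Process.GetCurrentProcess().Handle, out isWow64);
            Console.WriteLine($"Running under WOW64: {isWow64}");
        }
    }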
No, it's not bad. In fact, for an application I'm working on, I have to target x86, since it brings in COM objects for which the vendor doesn't support x64.
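For reference, forcing x86 is a one-line setting in the project file; in the .csproj it is the PlatformTarget property (the same option Visual Studio exposes under Build > Platform target):

    <PropertyGroup>
      <PlatformTarget>x86</PlatformTarget>
    </PropertyGroup>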