Using Visual Studio 2013 Pro, I have built my first small console application. This console app should run quickly on every computer in our network, which runs anything from Windows XP SP3 up to Windows 8.1.
Therefore I created the app to target .NET 2.0, but to my surprise a Windows 8 machine complained it had to install/activate .NET 3.5 to run it. Windows 8 does not have this enabled by default!
When I target 2.0, it runs on all our XP and W7 devices but not on W8. When I target 4.0, it runs on all W8s and some W7s but not on XP.
It is the same application, so is there really no way to tell the OS to use whichever framework is available?
I have tried adding a config file ('MyAppName.exe.config' / 'App.config' in the project) containing:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <startup>
    <supportedRuntime version="v2.0.50727"/>
    <supportedRuntime version="v4.0"/>
  </startup>
</configuration>
I swapped the order of the entries and targeted .NET 2.0, 3.0, 3.5 and the 3.5 Client Profile, but whenever I target anything below .NET 4, W8 keeps asking to activate .NET 3.5.
How can I get this small app working on XP SP3, W7 and W8 without installing any extra frameworks, as long as the client has .NET 2.0 or higher installed?
I believe you simply missed the startup element:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <startup>
    <supportedRuntime version="v2.0.50727"/>
    <supportedRuntime version="v4.0"/>
  </startup>
</configuration>
Just try to build it again with target version 3.5 using this configuration.
Another option: do you have .NET Framework 4.0 installed on both Windows XP and Windows 8? If so, you can try using that version as the target when you build your app.
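If listing both runtimes still does not help, it may be worth knowing that the runtime tries the supportedRuntime entries in the order they appear and loads the first CLR it can satisfy, and that for .NET 4 and later a sku attribute is usually added. A minimal sketch of such a config (the sku value shown is the standard identifier for the full 4.0 framework; adjust it if you target a different profile):

<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <startup>
    <!-- Entries are tried in order: prefer the 2.0 CLR where it exists
         (XP/W7), and fall back to the 4.0 CLR on machines without 3.5 (W8). -->
    <supportedRuntime version="v2.0.50727"/>
    <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0"/>
  </startup>
</configuration>

With this in place, a 2.0/3.5-targeted executable should be allowed to run on the 4.0 CLR on machines where .NET 3.5 is not activated.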