I've seen several questions that ask the opposite of this: "How do I disable virtualization?" That is not my question. I want to force an application to run with virtualization enabled.
I have an application that ran just fine under Windows XP but does not work completely under Windows 7, because it writes its configuration to its working directory (a subfolder of "C:\Program Files (x86)"). If I use Task Manager to turn on UAC Virtualization for the running process, it saves its config just fine, but of course, since that setting only applies to the instance that is already running, it can't load that config the next time it starts.
I do not want to set it to run as administrator, as it does not need those privileges. I want to set it to run with UAC Virtualization enabled.
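For reference, as far as I can tell the Task Manager checkbox just flips the TokenVirtualizationEnabled flag on the target process's access token. A minimal sketch of doing the same thing programmatically would be something like the following (the required token access right is my assumption, and it only succeeds if the token still allows virtualization):

```c
/* enable_virt.c - turn UAC virtualization on for a running process by PID.
   Build: cl enable_virt.c advapi32.lib
   This mirrors what Task Manager's "UAC Virtualization" checkbox appears to do. */
#include <windows.h>
#include <stdio.h>
#include <stdlib.h>

int main(int argc, char **argv)
{
    if (argc < 2) {
        fprintf(stderr, "usage: %s <pid>\n", argv[0]);
        return 1;
    }
    DWORD pid = (DWORD)strtoul(argv[1], NULL, 10);

    HANDLE hProc = OpenProcess(PROCESS_QUERY_INFORMATION, FALSE, pid);
    if (!hProc) {
        fprintf(stderr, "OpenProcess failed: %lu\n", GetLastError());
        return 1;
    }

    /* TOKEN_ADJUST_DEFAULT is my guess at the right SetTokenInformation needs here. */
    HANDLE hToken;
    if (!OpenProcessToken(hProc, TOKEN_QUERY | TOKEN_ADJUST_DEFAULT, &hToken)) {
        fprintf(stderr, "OpenProcessToken failed: %lu\n", GetLastError());
        CloseHandle(hProc);
        return 1;
    }

    /* TokenVirtualizationEnabled takes a DWORD: 1 = on, 0 = off. */
    DWORD enable = 1;
    if (!SetTokenInformation(hToken, TokenVirtualizationEnabled,
                             &enable, sizeof(enable))) {
        fprintf(stderr, "SetTokenInformation failed: %lu\n", GetLastError());
    } else {
        printf("UAC virtualization enabled for PID %lu\n", pid);
    }

    CloseHandle(hToken);
    CloseHandle(hProc);
    return 0;
}
```

But that still only affects an instance that is already running, which is the same limitation as the Task Manager checkbox; what I want is for virtualization to apply from the moment the process starts.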
I found a suggestion that I put some magic in the registry at HKLM\SOFTWARE\Microsoft\Windows NT\CurrentVersion\AppCompatFlags. For completeness I also put it under the corresponding Wow6432Node key, but neither had any effect.
The file system is only virtualized in certain scenarios, so is your question how to turn it on when your application does not qualify? That is unlikely to be possible. From MSDN:
Virtualization is not an option in the following scenarios:
Virtualization does not apply to applications that are elevated and run with a full administrative access token.
Virtualization supports only 32-bit applications. Non-elevated 64-bit applications simply receive an access denied message when they attempt to acquire a handle (a unique identifier) to a Windows object. Native Windows 64-bit applications are required to be compatible with UAC and to write data into the correct locations.
Virtualization is disabled for an application if the application includes an application manifest with a requested execution level attribute.
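One way to see which of those cases is biting you is to ask the token directly: GetTokenInformation exposes both a "virtualization allowed" and a "virtualization enabled" flag. Here is a small diagnostic sketch that checks the current process's own token; note that if your toolchain embeds a UAC manifest with a requested execution level by default, "allowed" will report 0 for this program itself, per the last bullet above:

```c
/* virt_check.c - report whether UAC virtualization is allowed/enabled
   for the current process. Build: cl virt_check.c advapi32.lib */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    HANDLE hToken;
    DWORD allowed = 0, enabled = 0, len = 0;

    if (!OpenProcessToken(GetCurrentProcess(), TOKEN_QUERY, &hToken)) {
        fprintf(stderr, "OpenProcessToken failed: %lu\n", GetLastError());
        return 1;
    }

    /* "Allowed" is 0 for elevated, 64-bit, or manifested processes,
       i.e. the cases listed above; "enabled" is the current state. */
    GetTokenInformation(hToken, TokenVirtualizationAllowed,
                        &allowed, sizeof(allowed), &len);
    GetTokenInformation(hToken, TokenVirtualizationEnabled,
                        &enabled, sizeof(enabled), &len);

    printf("virtualization allowed: %lu, enabled: %lu\n", allowed, enabled);

    CloseHandle(hToken);
    return 0;
}
```

If "allowed" comes back 0 for a process in your application's situation, forcing virtualization on for it is not going to work.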