A native .NET application loads each referenced assembly (and its references) on first use. An ASP.NET application, however, loads all referenced assemblies (and their references) on first access to the site.
Is this understanding correct?
Is there a way to force ASP.NET to load assemblies on demand, as local applications do?
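One way to check the desktop behaviour (and contrast it with ASP.NET) is to hook the AppDomain.AssemblyLoad event from a console application. The sketch below is illustrative only; System.Xml.Linq stands in for any referenced assembly:

using System;

class Program
{
    static void Main()
    {
        // Log every assembly as the runtime loads it.
        AppDomain.CurrentDomain.AssemblyLoad += (sender, args) =>
            Console.WriteLine("Loaded: " + args.LoadedAssembly.FullName);

        Console.WriteLine("Before first use of the dependency.");
        UseDependency();
        Console.WriteLine("After first use.");
    }

    static void UseDependency()
    {
        // System.Xml.Linq is loaded only when the JIT compiles this method,
        // i.e. just before its first call, so its "Loaded:" line appears
        // between the two markers above.
        var probe = new System.Xml.Linq.XElement("probe");
        Console.WriteLine(probe.Name);
    }
}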
The specific scenario I am trying to resolve is an application whose web service assembly has optional dependencies; the application should still run when those optional assemblies are not deployed.
To answer point 3: the setting that causes all the assemblies in the bin folder to be loaded on first access can be found in C:\winnt\Microsoft.NET\Framework\v2.0.50727\CONFIG\web.config (the exact path depends on your environment). A cut-down extract from that file:
<system.web>
  <compilation>
    <assemblies>
      <add assembly="*" />
    </assemblies>
  </compilation>
</system.web>
All assemblies that match the wildcard are loaded as part of the initial compilation.
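To see this in practice, one could dump the already-loaded assemblies on the first request. This is a hedged diagnostic sketch, not part of the original answer; the Global class follows the standard Global.asax code-behind template:

using System;
using System.Web;

public class Global : HttpApplication
{
    protected void Application_Start(object sender, EventArgs e)
    {
        // With <add assembly="*" /> in effect, every assembly in bin
        // should already appear in this list on the first request.
        foreach (var asm in AppDomain.CurrentDomain.GetAssemblies())
        {
            System.Diagnostics.Trace.WriteLine("Loaded at startup: " + asm.FullName);
        }
    }
}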
By modifying the web.config for the application (NOT the global .NET one) to add the web service assembly explicitly and remove the wildcard match, it appears the application can function even when the optional dependencies are missing:
<system.web>
  <compilation>
    <assemblies>
      <remove assembly="*" />
      <add assembly="Main.Application.WebService, Version=1.0.0.0, Culture=neutral, PublicKeyToken=YOURKEYHERE" />
    </assemblies>
  </compilation>
</system.web>
We're still experimenting with this, so we're not sure whether it completely resolves the issue or has any unusual side effects.
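If the optional dependencies must stay truly on demand, a complementary approach (not from the original answer; the assembly and type names below are hypothetical) is to reach them only via reflection, so the compiler records no static reference and the runtime only looks for the assembly when the feature is actually used:

using System;
using System.Reflection;

static class OptionalFeature
{
    public static bool TryRun()
    {
        try
        {
            // Loaded only when the feature is actually requested.
            Assembly asm = Assembly.Load("Optional.Dependency");
            Type type = asm.GetType("Optional.Dependency.Feature", true);
            object instance = Activator.CreateInstance(type);
            type.GetMethod("Execute").Invoke(instance, null);
            return true;
        }
        catch (System.IO.FileNotFoundException)
        {
            // The optional assembly is not deployed; degrade gracefully.
            return false;
        }
    }
}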