Silverlight 4.0 and WCF client proxy - how to create and how to close instances

The lifecycle of Silverlight WCF service proxies is not very clear to me. I have read various materials, resources, and answers here, but I still don't completely understand the best way to use them.

I am currently using a custom binary binding in Silverlight 4.0.

Is creating a proxy in Silverlight an expensive operation? Should we share a proxy instance in code, or is it better to create a new one each time? If we do share one, should we lock around it in case multiple threads access it?

Since an error on a proxy puts it into the faulted state, I think sharing a proxy isn't a good idea, but I've read that creation is expensive, so it's not 100% clear what to do here.

And then there is closing: Silverlight WCF service clients only provide a CloseAsync method. Proxies also require specific logic when they are closed (if a proxy is faulted we should call Abort(), which is synchronous in Silverlight, and if not we should call CloseAsync, which is asynchronous, or is there more to it?).
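To make this concrete, the cleanup logic I have in mind looks roughly like this (just a sketch of my current understanding, and whether it is actually correct is part of what I am asking; MyServiceClient is a placeholder for any generated proxy):

void CleanUpProxy(MyServiceClient client)
{
    if (client.State == System.ServiceModel.CommunicationState.Faulted)
    {
        // A faulted proxy cannot be closed; Abort() is synchronous in Silverlight.
        client.Abort();
    }
    else if (client.State != System.ServiceModel.CommunicationState.Closed)
    {
        // CloseAsync is the only close operation the Silverlight client
        // exposes, and it completes asynchronously.
        client.CloseAsync();
    }
}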

In many official Silverlight samples from MS the proxies are not closed at all. Is that just a flaw in the materials, or is it the expected way to treat them?

This topic is very important to me, and I want a clear understanding of everything that should be considered, which I currently don't have.

(I did see the question What is the proper life-cycle of a WCF service client proxy in Silverlight 3?, which appears close to mine, but I cannot say I am satisfied with the quality of its answers.)

I would really like to see sample code that creates, uses, and closes WCF proxies, and most importantly explains why that is the best possible way. I also think (currently believe) that, because of the nature of the problem, there should be a single, general-purpose best practice/pattern for using (creating, reusing, closing) WCF proxies in Silverlight.

asked Aug 19 '11 by Valentin Kuzub

1 Answer

Summary: I believe the best practice is to instantiate your web service client when you are about to use it, then let it go out of scope and get garbage collected. This is reflected in the samples you see coming from Microsoft. Justification follows...

Full: The best description of the process that I have found is How to: Access a Service from Silverlight. The example there shows the typical pattern of instantiating the web service client and allowing it to go out of scope (without explicitly closing it). Web service clients inherit from ClientBase, which has a Finalize method that frees any unmanaged resources, if necessary, when the object is garbage collected.

I have a decent amount of experience using web services; I instantiate proxies right before use and then allow them to be garbage collected, and I have never had a problem with this approach. I read on Wenlong Dong's blog that creation of the proxy is expensive, but even he says performance improved in .NET 3.5 (perhaps it has improved again since then?). What I can tell you is that performance is relative: unless the data you are retrieving is trivially small, far more time will be spent serializing/deserializing and in transport than in creating the connection. That has certainly been my experience, and you are better off optimizing in those areas first.
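If you want to gauge this for yourself, a crude timing sketch along these lines will do (hypothetical: DateTime-based timing is coarse, and Service1Client/DoWork are the placeholder names from my test project below):

public void TimeOneCall()
{
    DateTime start = DateTime.Now;
    ServiceReference1.Service1Client client = new ServiceReference1.Service1Client();
    // Time spent constructing the proxy itself.
    TimeSpan creation = DateTime.Now - start;

    DateTime callStart = DateTime.Now;
    client.DoWorkCompleted += (obj, args) =>
    {
        // Time spent on serialization, transport, and deserialization.
        TimeSpan roundTrip = DateTime.Now - callStart;
        System.Windows.MessageBox.Show(string.Format(
            "Creation: {0:F1} ms, round trip: {1:F1} ms",
            creation.TotalMilliseconds, roundTrip.TotalMilliseconds));
    };
    client.DoWorkAsync();
}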

Last, since I figure my opinions thus far may be insufficient, I wrote a quick test. I created a Silverlight-enabled web service using the template provided with Visual Web Developer 2010 Express (with the default void method DoWork()). Then, in my sample Silverlight client, I called it using the following code:

int counter = 0;

public void Test()
{
    // A new proxy for every call; it is never explicitly closed.
    ServiceReference1.Service1Client client = new ServiceReference1.Service1Client();
    client.DoWorkCompleted += (obj, args) =>
    {
        counter++;
        if (counter > 9999)
        {
            // Once all 10,000 calls have completed, force collection so the
            // abandoned proxies are finalized and their memory is reclaimed.
            for (int j = 0; j < 10; j++) GC.Collect();
            System.Windows.MessageBox.Show("Completed");
        }
    };
    client.DoWorkAsync();
}

I then called the Test method using for(int i=0;i<10000;i++) Test(); and fired up the application. It took a little over 20 seconds to load the app and complete all 10,000 web service calls. As the calls were being made I saw the memory usage for the process climb to over 150 MB, but once the calls completed and GC.Collect() was called, usage dropped to less than half that amount. Far from being a perfect test, it seems to confirm that no memory was leaking, or that any leak was negligible (and it is probably uncommon to make 10,000 web service calls, each with its own client instance). It is also a much simpler model than keeping a proxy object around and having to worry about it faulting and needing to be reopened.

Justification of test methodology: My test focused on two potential problems: a memory leak, and processor time spent creating and destroying the objects. My recommendation is that it is safe to follow the examples provided by the company (Microsoft) that supplies the classes. If you are concerned about network efficiency, my example should not worry you, since properly creating/disposing these objects does not affect network latency. If 99% of the time is network time, then optimizing for a theoretical improvement in the remaining 1% is probably wasteful in terms of development time (assuming there is even a benefit to be gained, and I believe my test shows there is little or none). Yes, the networking calls were local, which is to say that over the course of 10,000 service calls, only about 20 seconds were spent in total, including waiting for the objects. That works out to roughly 2 milliseconds per service call spent on creating the objects.

Regarding the need to call Dispose: I didn't mean to imply that you shouldn't call it, merely that it didn't appear necessary. If you forget (or simply choose not to), my tests led me to believe that Dispose is effectively handled when these objects are finalized. Even so, it would probably be slightly more efficient to clean up yourself, but the effect is negligible. For most software development you get more gains from better algorithms and data structures than from agonizing over issues like these (unless there is a serious memory leak). And if you require more efficiency, perhaps you shouldn't be using web services at all, since there are more efficient data-transit options than a system based on XML.
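If you do prefer eager cleanup over relying on finalization, here is a sketch of what that could look like (again using the hypothetical names from my test project; State, Abort, and CloseAsync come from the generated client):

public void TestWithCleanup()
{
    ServiceReference1.Service1Client client = new ServiceReference1.Service1Client();
    client.DoWorkCompleted += (obj, args) =>
    {
        // Clean up eagerly once the call completes: abort a faulted proxy,
        // otherwise close it asynchronously.
        if (client.State == System.ServiceModel.CommunicationState.Faulted)
            client.Abort();
        else if (client.State == System.ServiceModel.CommunicationState.Opened)
            client.CloseAsync();
    };
    client.DoWorkAsync();
}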

answered by Adam Jones