I am looking to implement some throttling behavior in one of my viewmodels. It's a Silverlight application, though I don't think that's particularly important.
Consider a class with three properties:
Whenever one of these properties is updated, a refresh is necessary.
private void Refresh()
{
    // Call out to the server, do something when it comes back
}
My goals are as follows:
If it matters, I am using a ChannelFactory implementation for the server call.
What kind of patterns can I use to accomplish this? Is this something Reactive Extensions can help me with?
Edit:
Marking Paul's answer as correct. While ReactiveUI doesn't currently work with Silverlight 5, it clearly outlines the approach and composition steps for solving the problem with Rx.
Here's how you would do this with ReactiveUI:
IObservable<TheData> FetchNewData()
{
// TODO: Implement me
}
this.WhenAny(x => x.Property1, x => x.Property2, x => x.Property3, (x,y,z) => Unit.Default)
.Throttle(TimeSpan.FromMilliseconds(200), RxApp.DeferredScheduler)
.Select(x => FetchNewData())
    .Switch() // We only care about the request corresponding to the latest values of Prop1-3
.ToProperty(this, x => x.Data);
Update: Here's how to guarantee that only one request runs at a time, with the caveat that you may get out-of-order results.
this.WhenAny(x => x.Property1, x => x.Property2, x => x.Property3, (x,y,z) => Unit.Default)
.Throttle(TimeSpan.FromMilliseconds(200), RxApp.DeferredScheduler)
.Select(_ => Observable.Defer(() => FetchNewData()))
.Merge(1)
.ToProperty(this, x => x.Data);
The behavior you describe would actually perhaps not be desirable: if the properties keep changing, you'll end up with a queue of stale requests to issue. You could optimize this with something like a "BufferingSwitch()" operator that didn't return its results until it was sure there were no more changes - that'd actually be a cool operator to write.
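To make the Switch-style "latest request wins" semantics concrete outside of Rx, here is a minimal, library-free sketch in Python asyncio. All names here (`LatestWinsRefresher`, `on_property_changed`, `fetch`) are hypothetical illustrations, not ReactiveUI APIs; the point is only to show how Throttle (wait for a quiet period) composes with Switch (cancel the in-flight request when a newer change arrives):

```python
import asyncio

class LatestWinsRefresher:
    """Debounce property changes, then fetch; a newer change cancels the
    in-flight fetch -- the same semantics as Rx's Throttle + Switch."""

    def __init__(self, fetch, quiet_period=0.2):
        self._fetch = fetch         # async callable: () -> data
        self._quiet = quiet_period  # the Throttle window
        self._task = None
        self.data = None            # last successful result (like ToProperty)

    def on_property_changed(self):
        # Switch semantics: a new change cancels any pending/in-flight work.
        if self._task is not None:
            self._task.cancel()
        self._task = asyncio.ensure_future(self._run())

    async def _run(self):
        await asyncio.sleep(self._quiet)  # Throttle: wait for a quiet period
        self.data = await self._fetch()   # only the latest request survives

async def demo():
    calls = []

    async def fetch():
        calls.append(1)
        await asyncio.sleep(0.01)  # simulated server round-trip
        return len(calls)

    r = LatestWinsRefresher(fetch, quiet_period=0.05)
    for _ in range(3):             # a burst of property changes
        r.on_property_changed()
        await asyncio.sleep(0.01)
    await asyncio.sleep(0.2)       # let the pipeline settle
    return r.data, len(calls)

data, n_calls = asyncio.run(demo())
print(data, n_calls)  # the burst of three changes collapses into one fetch
```

Note that this sketch cancels the server call outright rather than merely ignoring its result, which is closer to `Switch` than to `Merge(1)`; the queue-of-stale-requests problem described above is exactly what this cancellation avoids.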
Moral of the story, Async Is Complicated™ :)