This question is partly about delegates, and partly about generics.
Given the simplified code:
internal sealed class TypeDispatchProcessor
{
    private readonly Dictionary<Type, Delegate> _actionByType
        = new Dictionary<Type, Delegate>();

    public void RegisterProcedure<T>(Action<T> action)
    {
        _actionByType[typeof(T)] = action;
    }

    public void ProcessItem(object item)
    {
        Delegate action;
        if (_actionByType.TryGetValue(item.GetType(), out action))
        {
            // Can this call to DynamicInvoke be avoided?
            action.DynamicInvoke(item);
        }
    }
}
I read elsewhere on SO that invoking a delegate directly (with parentheses) is orders of magnitude faster than calling DynamicInvoke, which makes sense.
For the code sample above, I'm wondering whether I can keep the type-based dispatch but avoid DynamicInvoke and so improve performance.
Some context: I have a stream of objects that get farmed out to various handlers, and those handlers can be registered/unregistered at runtime. The above pattern functions perfectly for my purposes, but I'd like to make it snappier if possible.
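For concreteness, here's a rough sketch of how I use it (the registered handler types and messages below are just made-up examples):

// Rough usage sketch; the handlers here are hypothetical.
var processor = new TypeDispatchProcessor();

// Handlers can be registered (or replaced) at runtime.
processor.RegisterProcedure<string>(s => Console.WriteLine("string: " + s));
processor.RegisterProcedure<int>(i => Console.WriteLine("int: " + i));

// Each item from the stream is dispatched by its runtime type.
processor.ProcessItem("hello");
processor.ProcessItem(42);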
One option would be to store Action<object> in the dictionary and wrap each Action<T> delegate in another delegate. I haven't yet measured the performance cost that this second, indirect call would add.
I strongly suspect that wrapping the calls would be a lot more efficient than using DynamicInvoke. Your code would then be:
internal sealed class TypeDispatchProcessor
{
    private readonly Dictionary<Type, Action<object>> _actionByType
        = new Dictionary<Type, Action<object>>();

    public void RegisterProcedure<T>(Action<T> action)
    {
        _actionByType[typeof(T)] = item => action((T) item);
    }

    public void ProcessItem(object item)
    {
        Action<object> action;
        if (_actionByType.TryGetValue(item.GetType(), out action))
        {
            action(item);
        }
    }
}
It's worth benchmarking, but I think you'll find this a lot more efficient. DynamicInvoke has to validate all the arguments via reflection, whereas the wrapped delegate only performs a simple cast.
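As a rough illustration, here's a minimal Stopwatch micro-benchmark sketch (the handler and iteration count are arbitrary assumptions, not from the original code) comparing DynamicInvoke against the wrapped Action<object> call:

using System;
using System.Diagnostics;

internal static class DispatchBenchmark
{
    private static void Main()
    {
        // A trivial typed handler, held three ways: as Action<T>,
        // as a bare Delegate (for DynamicInvoke), and wrapped in Action<object>.
        Action<string> typed = s => { };
        Delegate untyped = typed;
        Action<object> wrapped = o => typed((string) o);

        const int iterations = 1000000;
        object item = "hello";

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
            untyped.DynamicInvoke(item);
        Console.WriteLine("DynamicInvoke: " + sw.ElapsedMilliseconds + " ms");

        sw.Restart();
        for (int i = 0; i < iterations; i++)
            wrapped(item);
        Console.WriteLine("Wrapped call:  " + sw.ElapsedMilliseconds + " ms");
    }
}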