I would like to know whether the use of attributes in .NET, specifically in C#, is expensive, and why or why not.
I am asking about C# specifically, unless there is no difference between the .NET languages (since they share the same base class library?).
All the newer .NET technologies make extensive use of attributes, such as LINQ to SQL, ASP.NET MVC, WCF, Enterprise Library, etc., and I was wondering what effect this has on performance. A lot of classes get automatically decorated with certain attributes, or these attributes are required for certain functionality/features.
Does the question of expense depend on implementation-specific details? How are attributes compiled to IL? Are they cached automatically, or is that up to the implementer?
"The usage of attributes" is too vague. Fetching the attributes is a reflection operation effectively - you wouldn't want to regularly do it in a loop - but they're not expensive to include in the metadata, and the typical usage pattern (IMO) is to build some other representation (e.g. an in-memory schema) after reading the attributes once.
There may well be some caching involved, but I'd probably cache the other representation anyway. For example, if I were decorating enum values with descriptions, I'd generally fetch the attributes once to build a string to enum dictionary (or vice versa).
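Here's a minimal sketch of that pattern (the `Status` enum and `Describe` helper are just illustrative names): the reflection work happens once, in the static initializer, and every subsequent lookup is a plain dictionary hit.

```csharp
using System.Collections.Generic;
using System.ComponentModel;
using System.Linq;
using System.Reflection;

public enum Status
{
    [Description("Awaiting approval")]
    Pending,
    [Description("Approved and active")]
    Active
}

public static class EnumDescriptions
{
    // Reflection runs once per enum type when this dictionary is built;
    // after that, Describe is just a dictionary lookup.
    private static readonly Dictionary<Status, string> Cache =
        typeof(Status).GetFields(BindingFlags.Public | BindingFlags.Static)
            .ToDictionary(
                f => (Status)f.GetValue(null),
                f => f.GetCustomAttribute<DescriptionAttribute>()?.Description
                     ?? f.Name);

    public static string Describe(Status value) => Cache[value];
}
```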
It depends on how you use them... Some attributes are purely informational (ObsoleteAttribute, for instance), so they have no impact on runtime performance. Other attributes are handled by the compiler (like DllImportAttribute) or by post-compilers like PostSharp, so the cost is paid at compile time, not at run time. However, if you use reflection to inspect attributes at runtime, it can be expensive.
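To illustrate that last point, here's a sketch of a runtime lookup (`TableNameAttribute` is a made-up attribute for the example). Each call to `GetCustomAttribute` walks the metadata and constructs a fresh attribute instance, which is fine once at startup but wasteful inside a hot loop - hence the caching pattern described above.

```csharp
using System;
using System.Reflection;

// Hypothetical attribute, defined just for this example.
[AttributeUsage(AttributeTargets.Class)]
public sealed class TableNameAttribute : Attribute
{
    public string Name { get; }
    public TableNameAttribute(string name) => Name = name;
}

[TableName("Customers")]
public class Customer { }

public static class Demo
{
    public static void Main()
    {
        // Each invocation re-reads the metadata and allocates a new
        // attribute object - do this once and cache the result.
        var attr = typeof(Customer).GetCustomAttribute<TableNameAttribute>();
        Console.WriteLine(attr?.Name); // prints "Customers"
    }
}
```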