I was experimenting with the const modifier while exploring a plethora of C# tutorials, and placed a bunch of const modifiers in a class like this without actually using them anywhere:
class ConstTesting
{
const decimal somedecimal = 1;
const int someint = 2;
...
}
With this class, I get the following warning (using csc):
ConstTesting.cs(3,19): warning CS0414: The field 'ConstTesting.somedecimal' is assigned but its value is never used
What I don't understand is that I only get the warning for the const decimal. The const int doesn't give me any warning, regardless of the order or anything like that.
My question is: why does this happen? Why would my csc compiler warn me about a const in the first place, and if it does, why would it only warn me about const decimal when I'm writing const int in exactly the same way? What on earth would the difference between int and decimal have to do with it?
int is a simple value type that the CLR can store directly as a compile-time constant in metadata. decimal is more complicated: its value consists of a 96-bit integer, a sign, and a scale, and the CLR has no native notion of a decimal constant, so the compiler cannot emit one. If you decompile your code, you'll find that it looks like this:
[DecimalConstant(0, 0, 0, 0, 1)]              // the constant's value, encoded as attribute arguments
private readonly static decimal somedecimal;  // a real field, initialized at runtime
private const int someint = 2;                // a true metadata constant
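You can reproduce the same behavior by hand. Below is a minimal sketch of what the compiler effectively generates (the class name HandWrittenEquivalent is mine, for illustration): a private static readonly decimal that is assigned during type initialization but never read, which is precisely the pattern CS0414 flags, next to a true int constant that involves no runtime assignment at all:

class HandWrittenEquivalent
{
    // Assigned when the type is initialized, but never read anywhere:
    // exactly the pattern CS0414 warns about.
    private static readonly decimal somedecimal = 1;

    // A true compile-time constant: the value lives in metadata and no
    // field is ever assigned at runtime, so there is nothing to warn about.
    private const int someint = 2;
}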
Here the decimal is not a constant at all: it is a static readonly field, decorated with a DecimalConstant attribute from mscorlib.dll, and assigned in a compiler-generated static constructor. Since that field is assigned but never read, the compiler raises CS0414 exactly as it would for any other private field. The true definition of decimal is:
public struct Decimal : IFormattable, IComparable, IConvertible,
IDeserializationCallback, IComparable<decimal>, IEquatable<decimal>
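The five attribute arguments are how the value survives in metadata: the DecimalConstant constructor takes (scale, sign, hi, mid, low), and the compiler reconstructs the decimal from those parts when another assembly consumes the field. A minimal sketch showing that (0, 0, 0, 0, 1) round-trips to 1m (the class and field names here are mine, for illustration):

using System;
using System.Runtime.CompilerServices;

class DecimalConstantDemo
{
    // Hand-applied version of what csc emits for a decimal "constant" of 1:
    // scale 0, sign 0 (positive), and the 96-bit integer parts hi/mid/low.
    [DecimalConstant(0, 0, 0, 0, 1)]
    private static readonly decimal One = new decimal(1, 0, 0, false, 0); // lo, mid, hi, isNegative, scale

    static void Main()
    {
        Console.WriteLine(One);        // 1
        Console.WriteLine(One == 1m);  // True
    }
}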
A much more in-depth exploration of this topic is covered in this blog post.