I can create a literal long by appending an L to the value; why can't I create a literal short or byte in some similar way? Why do I need to use an int literal with a cast?
And if the answer is "Because there was no short literal in C", then why are there no short literals in C?
This doesn't actually affect my life in any meaningful way; it's easy enough to write (short) 0 instead of 0S or something. But the inconsistency makes me curious; it's one of those things that bother you when you're up late at night. Someone at some point made a design decision to make it possible to enter literals for some of the primitive types, but not for all of them. Why?
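To make the asymmetry concrete, here is a small illustration (my own sketch, not part of the original question; the class name is arbitrary):

    public class LiteralSuffixes {
        public static void main(String[] args) {
            long big = 10000000000L;   // the L suffix makes this a long literal
            // short small = 0S;       // no such suffix exists; this would not compile
            short small = (short) 0;   // instead, an int literal is cast to short
            byte tiny = (byte) 0;      // same story for byte
            System.out.println(big + " " + small + " " + tiny);
        }
    }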
You can use a byte literal in Java... sort of: byte f = 0; f = 0xa; Here the int literal 0xa is converted to byte automatically, because it is a compile-time constant that fits in the byte range. It's not a real byte literal (see the JLS and the comments below), but if it quacks like a duck, I call it a duck.
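A minimal sketch of what this answer describes (the class name and the 0x80 lines are my own additions for contrast):

    public class ByteAssignment {
        public static void main(String[] args) {
            byte f = 0;
            f = 0xa;           // 0xa is a constant int expression that fits in a byte,
                               // so the compiler narrows it implicitly (JLS 5.2, assignment contexts)
            // f = 0x80;       // 128 does not fit in a byte; this line would not compile
            f = (byte) 0x80;   // with an explicit cast it compiles, and the value wraps to -128
            System.out.println(f);
        }
    }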
A literal in Java is a source-code representation of boolean, numeric, character, or string data. It is a means of expressing particular values in a program; for example, in the following statement an integer variable named count is assigned an integer value: int count = 0; The literal 0 represents the value zero.
short: The short data type is a 16-bit signed two's complement integer. It has a minimum value of -32,768 and a maximum value of 32,767 (inclusive). As with byte, the same guidelines apply: you can use a short to save memory in large arrays, in situations where the memory savings actually matters.
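A short sketch of those bounds and of the constant narrowing mentioned above (my own example):

    public class ShortRange {
        public static void main(String[] args) {
            short min = Short.MIN_VALUE;   // -32768
            short max = Short.MAX_VALUE;   //  32767
            short count = 0;               // the int literal 0 is narrowed at compile time
            System.out.println(min + " " + max + " " + count);
        }
    }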
In C, int at least was meant to have the "natural" word size of the CPU, and long was probably meant to be the "larger natural" word size (I'm not sure about that last part, but it would also explain why int and long have the same size on x86).

Now, my guess is: for int and long, there's a natural representation that fits exactly into the machine's registers. On most CPUs, however, the smaller types byte and short would have to be padded to an int anyway before being used. If that's the case, you might as well have a cast.
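A related effect is visible in Java itself: byte and short operands are promoted to int before arithmetic, so the result has to be cast back down. A quick illustration (my own example, not from the answer above):

    public class Promotion {
        public static void main(String[] args) {
            byte a = 1;
            byte b = 2;
            // byte c = a + b;        // would not compile: a and b are promoted to int,
            //                        // so the expression a + b has type int
            byte c = (byte) (a + b);  // an explicit cast back to byte is required
            System.out.println(c);
        }
    }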
I suspect it's a case of "don't add anything to the language unless it really adds value" - and it was seen as adding sufficiently little value to not be worth it. As you've said, it's easy to get round, and frankly it's rarely necessary anyway (only for disambiguation).
The same is true in C#, and I've never particularly missed it in either language. What I do miss in Java is an unsigned byte type :)
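For what it's worth, the usual workaround for the missing unsigned byte is to mask with 0xFF, or to use Byte.toUnsignedInt (Java 8+); a minimal sketch:

    public class UnsignedByte {
        public static void main(String[] args) {
            byte b = (byte) 0xF0;                // stored as -16
            int masked = b & 0xFF;               // masking recovers 240
            int helper = Byte.toUnsignedInt(b);  // same result, available since Java 8
            System.out.println(masked + " " + helper);
        }
    }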