julia> typeof(-0b111)
Uint64

julia> typeof(-0x7)
Uint64

julia> typeof(-7)
Int64
I find this result a bit surprising. Why does the numeric base of the literal determine signedness or unsignedness?
Looks like this is expected behavior:
This behavior is based on the observation that when one uses unsigned hex literals for integer values, one typically is using them to represent a fixed numeric byte sequence, rather than just an integer value.
http://docs.julialang.org/en/latest/manual/integers-and-floating-point-numbers/#integers
...seems like a bit of an odd choice.
This is a subjective call, but I think it's worked out pretty well. In my experience, when you use hex or binary, you're interested in a specific pattern of bits – and you generally want it to be unsigned. When you're just interested in a numeric value, you use decimal because that's what we're most familiar with. In addition, when you're using hex or binary, the number of digits you use for input is typically significant, whereas in decimal, it isn't. So that's how literals work in Julia: decimal gives you a signed integer of a type that the value fits in, while hex and binary give you an unsigned value whose storage size is determined by the number of digits.
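As a rough sketch of how those rules play out in a more recent 64-bit Julia session (where the unsigned types are spelled UInt8, UInt16, and so on rather than Uint64, and hex/binary literals are sized by digit count):

julia> typeof(7)        # decimal: signed, sized to the value / machine word
Int64

julia> typeof(0x7)      # 1–2 hex digits: unsigned 8-bit
UInt8

julia> typeof(0x0007)   # 4 hex digits: unsigned 16-bit
UInt16

julia> typeof(0b111)    # up to 8 binary digits: unsigned 8-bit
UInt8

julia> -0x7             # negating an unsigned literal stays unsigned and wraps modulo 2^8
0xf9

The last line shows why the question's results can look surprising: negation doesn't promote the unsigned literal to a signed type, it just wraps around in the same unsigned storage size.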