I assume that an internal cast happens when we write `arr[i]` (which is equivalent to `*(arr+i)`), because `i` can, for example, be a `short`, `int` or `long`, or the unsigned variant of any of these three.

So my question is simple: which type should `i` be so that no internal conversion takes place, and the code can run most efficiently?

Crude guess: `size_t`?
It's unlikely to make any significant difference to performance. In any case, you should always use the type that's semantically correct, rather than making premature optimizations for things that aren't going to matter. For indices, `size_t` is the smallest type that is a priori correct; you may be able to get away with a smaller type if you know the array you're working with is bounded, but usually you should just use `size_t` to be safe.
My answer: keep this simple. You may want to `typedef` a type for your index, like so:

typedef size_t TPtrIdx;

This way you can easily change the underlying type into whatever you want it to be, now or a year later.
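As a sketch of how that typedef might be used (the `find_first` helper is hypothetical, not part of the answer), every loop is written against `TPtrIdx`, so the underlying type can later be changed in one place:

```c
#include <stddef.h>

typedef size_t TPtrIdx; /* project-wide index type, as in the answer */

/* Return the index of the first occurrence of value, or len if absent. */
static TPtrIdx find_first(const int *arr, TPtrIdx len, int value) {
    for (TPtrIdx i = 0; i < len; i++)
        if (arr[i] == value)
            return i;
    return len;
}
```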
Regarding the potential problems with conversions between signed and unsigned types, a good compiler can warn you about them: for example, gcc offers the `-Wconversion` flag, and it is always a good idea to enable it.