In Python, you can do this:
arr = [1, 2, 3]
arr[-1]  # evaluates to 3
But in JS, you can't:
let arr = [1,2,3];
arr[-1]; // evaluates to undefined
The question is: why?
I know about the tricks to get around it (arr[arr.length - 1], modifying the array prototype, etc.), but that is not the point.
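For the record, a minimal sketch of those workarounds: the index arithmetic, Array.prototype.at (ES2022), and a purely illustrative Proxy wrapper (withNegativeIndexing is just a name made up for this sketch):
const xs = [1, 2, 3];
xs[xs.length - 1]; // 3 -- the usual index arithmetic
xs.at(-1);         // 3 -- Array.prototype.at accepts negative indices

// One way to fake the Python behaviour with a Proxy (illustrative only):
function withNegativeIndexing(target) {
  return new Proxy(target, {
    get(obj, prop) {
      if (typeof prop === "string") {
        const i = Number(prop);
        // Translate negative integer indices to offsets from the end
        if (Number.isInteger(i) && i < 0) return obj[obj.length + i];
      }
      return obj[prop];
    }
  });
}
withNegativeIndexing(xs)[-1]; // 3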
I'm trying to understand why it is still not in the ECMAScript standard to interpret negative array indices as indices counting from the end, even though it seems pretty easy to implement in a JS engine (and the whole Python community is having a blast with this notation).
What am I missing?
You're missing the point that arrays are objects (exotic objects) and -1 is a valid property key: it is coerced to the string "-1", so arr[-1] is an ordinary property lookup, not an out-of-range index.
var array = [1, 2, 3];
array[-1] = 42;          // stores 42 under the string key "-1", not as an array element
console.log(array);      // [1, 2, 3] plus an own property "-1": 42; length is still 3
console.log(array[-1]);  // 42
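You can confirm in any modern engine that "-1" ends up as an ordinary string-keyed property rather than an element:
var array = [1, 2, 3];
array[-1] = 42;
console.log(array.length);                // 3 -- the element count is unchanged
console.log(Object.keys(array));          // ["0", "1", "2", "-1"]
console.log(array.hasOwnProperty("-1"));  // true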