When I use the index(of:) method on a half-open range, it always returns an incorrect value if the range does not start at 0. See the code below.
let range = (3 ..< 10)
let indexOfRange = range.index(of: 5) // returns 5
let array = Array(5 ..< 10)
let indexOfArray = array.index(of: 5) // returns 0
I don't understand why this result is produced. Can anyone please explain?
Indices are opaque objects. If the collection is not an array, you shouldn't assume they are zero-based, or even that they are integers (for an example, see String.Index). To get a zero-based integer index, you need to get the distance from the startIndex:
let range = (3 ..< 10)
let opaqueIndex = range.index(of: 5)
let integerIndex = range.distance(from: range.startIndex, to: opaqueIndex!)
print(integerIndex) // 2
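For comparison, here is a minimal String sketch (standard library API, with a made-up literal for illustration) showing a collection whose indices are not integers at all:
let text = "hello"
let opaqueStringIndex = text.index(of: "l")   // String.Index, not an Int
let offset = text.distance(from: text.startIndex, to: opaqueStringIndex!)
print(offset) // 2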
However, for Int ranges that's basically the same as:
let integerIndex = 5 - range.lowerBound
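This shortcut works because for an Int range the indices are the elements themselves. A quick check (my own illustration, assuming a countable Int range as in the question):
let range = (3 ..< 10)
print(range.startIndex)       // 3, the first index equals the first element
print(range.index(of: 5)!)    // 5, the index of an element is the element itself
print(5 - range.lowerBound)   // 2, the zero-based offset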
The interesting part is that it seems ranges cannot be subscripted (ambiguous definition), so there is probably no point in getting the index in the first place.
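If you actually do need positional access, one possible workaround (my own suggestion, not something the range API gives you directly) is to convert the range to an Array first, since array indices are plain zero-based Ints:
let range = (3 ..< 10)
let array = Array(range)
print(array.index(of: 5)!) // 2
print(array[2])            // 5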