 

index(of: Int) method of Range in Swift returns incorrect value

When I call the index(of: Int) method on a half-open range, it always returns an incorrect value if the range does not start at 0. See the code below.

let range = (3 ..< 10)
let indexOfRange = range.index(of: 5) // returns 5


let array = Array(5 ..< 10)
let indexOfArray = array.index(of: 5)  // returns 0

I don't understand why this result is produced. Can anyone explain?

Bing asked Mar 07 '23 02:03
1 Answer

Indices are opaque objects. Unless the collection is an Array, you shouldn't assume they are zero-based, or even that they are integers (see String.Index for an example). To get a zero-based integer index, you need to compute the distance from startIndex:

let range = (3 ..< 10)
let opaqueIndex = range.index(of: 5)
let integerIndex = range.distance(from: range.startIndex, to: opaqueIndex!)
print(integerIndex) // 2
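
To see why an index cannot in general be treated as a plain number, here is a minimal sketch using String (the "hello" literal and variable names are just illustrative). String.Index is an opaque type, yet the same distance(from:to:) pattern yields a zero-based offset:

let word = "hello"

// index(of:) returns a String.Index, which is opaque and not an Int
// (in Swift 4.2 and later the method is named firstIndex(of:))
if let opaqueIndex = word.index(of: "l") {
    let offset = word.distance(from: word.startIndex, to: opaqueIndex)
    print(offset) // 2 -- the first "l" is the third Character
}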

However, for Int ranges that's basically the same as:

let integerIndex = 5 - range.lowerBound
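
As a quick sanity check (a minimal sketch reusing the same range as above, re-declared so the snippet stands alone), both computations produce the same zero-based offset:

let range = (3 ..< 10)

// offset via the Collection API
let viaDistance = range.distance(from: range.startIndex, to: range.index(of: 5)!)

// offset via plain arithmetic on the bounds
let viaArithmetic = 5 - range.lowerBound

print(viaDistance, viaArithmetic) // 2 2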

The interesting part is that it seems ranges cannot be subscripted (the subscript is reported as ambiguous), so there is probably no point in getting the index in the first place.

Sulthan answered Mar 17 '23 01:03