I think this must be a basic math operation, but I really can't figure it out. I need to get an array of values between 0 and 1 based on the values of a given array. For example, if the initial array is [24, 128, 52], the resulting array should be [0, 1, 0.5]. The maximum and minimum values should always map to 1 and 0 respectively, and the rest should fall proportionately in between them.
How can I do this in Swift?
Start with an array:
let sourceArray = [24.0, 128.0, 52.0]
Store the minimum and maximum elements:
let minValue = sourceArray.min()! // 24.0 (force-unwrapping is safe because the array isn't empty)
let maxValue = sourceArray.max()! // 128.0
Normalize each element and map the results to a new array:
let results = sourceArray.map { ($0 - minValue) / (maxValue - minValue) }
results is now:
[0, 1, 0.2692307692307692]
You specified you wanted [0, 1, 0.5], but I think that's wrong, since you said "the rest should be proportionately in between them": 52 is only about 27% of the way from 24 to 128. An input of [24.0, 128.0, 76.0] would output [0, 1, 0.5], because 76 is the midpoint of 24 and 128.
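If you need this in more than one place, the same idea can be wrapped in a small helper. This is just a sketch; the name normalized(_:) and the choice to return all zeros when the values have no spread are my own, not part of the question:

func normalized(_ values: [Double]) -> [Double] {
    // Guard against an empty array and against max == min, which would divide by zero.
    guard let minValue = values.min(), let maxValue = values.max(), maxValue > minValue else {
        return values.map { _ in 0.0 }
    }
    return values.map { ($0 - minValue) / (maxValue - minValue) }
}

normalized([24.0, 128.0, 76.0]) // [0.0, 1.0, 0.5]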
Note: the source array can't be [Int], or the division inside the map will be integer division and truncate every result to 0 (or 1 for the maximum element). If you need to work with [Int], you'll have to convert each element to Double inside the map operation, as sketched below.
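For example (a sketch; the intSource/intResults names are just for illustration):

let intSource = [24, 128, 52]
// Convert to Double before dividing so the arithmetic is floating-point, not integer.
let intMin = Double(intSource.min()!)
let intMax = Double(intSource.max()!)
let intResults = intSource.map { (Double($0) - intMin) / (intMax - intMin) }
// intResults is [0.0, 1.0, 0.2692307692307692]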