I have a function that converts a string to a decimal number:
func getDecimalFromString(_ strValue: String) -> NSDecimalNumber {
    let formatter = NumberFormatter()
    formatter.maximumFractionDigits = 1
    formatter.generatesDecimalNumbers = true
    return formatter.number(from: strValue) as? NSDecimalNumber ?? 0
}
But it is not working as expected. Sometimes it returns values like
Optional(8.300000000000001)
Optional(8.199999999999999)
instead of 8.3 or 8.2. The string contains values like "8.3" or "8.2", but the converted decimal is not what I need. Any suggestion where I made a mistake?
Setting generatesDecimalNumbers to true does not work as one might expect. The returned value is an instance of NSDecimalNumber (which can represent the value 8.3 exactly), but apparently the formatter converts the string to a binary floating-point number first (and that cannot represent 8.3 exactly). Therefore the returned decimal value is only approximately correct.
That has also been reported as a bug: "NSDecimalNumbers from NSNumberFormatter are affected by binary approximation error".
Note also that (contrary to the documentation) the maximumFractionDigits property has no effect when parsing a string
into a number.
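Here is a minimal sketch of the effect (assuming a locale that uses "." as the decimal separator; the exact trailing digits may vary):
import Foundation

let formatter = NumberFormatter()
formatter.generatesDecimalNumbers = true
formatter.locale = Locale(identifier: "en_US_POSIX") // assumed locale so "." is the separator

// The formatter parses via a binary Double internally,
// so the resulting NSDecimalNumber carries an approximation error:
if let n = formatter.number(from: "8.3") as? NSDecimalNumber {
    print(n) // e.g. 8.300000000000001
}

// Creating the decimal directly from the string is exact:
print(NSDecimalNumber(string: "8.3")) // 8.3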
There is a simple solution: Use
NSDecimalNumber(string: strValue) // or
NSDecimalNumber(string: strValue, locale: Locale.current)
instead, depending on whether the string is localized or not.
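For example, with an unlocalized string that uses "." as the decimal separator:
let d = NSDecimalNumber(string: "8.3")
print(d) // 8.3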
Or with the Swift 3 Decimal type:
Decimal(string: strValue) // or
Decimal(string: strValue, locale: .current)
Example:
if let d = Decimal(string: "8.2") {
    print(d) // 8.2
}
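If the string comes from user input in a locale that uses a different decimal separator, passing that locale makes the parsing match it. A small sketch, assuming a German-formatted input string:
// "8,3" uses a comma as the decimal separator (de_DE):
if let d = Decimal(string: "8,3", locale: Locale(identifier: "de_DE")) {
    print(d) // 8.3
}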