def multiplyStringNumericChars(list: String): Int = {
  var product = 1
  println(s"The actual thing + $list")
  list.foreach { x =>
    println(x.toInt)
    product = product * x.toInt
  }
  product
}
This is a function that takes a String like 12345 and should return the result of 1 * 2 * 3 * 4 * 5. However, the result I'm getting back doesn't make any sense. What is the implicit conversion from Char to Int actually returning?
It appears to be adding 48 to all the values. If I instead do product = product * (x.toInt - 48), the results are correct.
It does make sense: that is how characters are encoded in the ASCII table: the character '0' maps to decimal 48, '1' maps to 49, and so on. So converting a Char to an Int gives you its character code, and to get the digit value you just need to subtract '0':
scala> '1'.toInt
res1: Int = 49

scala> '0'.toInt
res2: Int = 48

scala> '1'.toInt - 48
res3: Int = 1

scala> '1' - '0'
res4: Int = 1
Or just use x.asDigit, as @Reimer said:
scala> '1'.asDigit
res5: Int = 1
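Putting that together, here is a sketch of a corrected version of the original function using asDigit and a foldLeft instead of a var (the input is assumed to contain only digit characters, as in the question):

```scala
// Corrected version: Char.asDigit converts '5' to 5, so the product
// is taken over the digit values rather than the ASCII codes.
def multiplyStringNumericChars(digits: String): Int =
  digits.foldLeft(1)((product, ch) => product * ch.asDigit)

// multiplyStringNumericChars("12345") == 1 * 2 * 3 * 4 * 5 == 120
```

Note that asDigit also handles hex digits ('a'.asDigit is 10), so for strict decimal input you may want to validate with ch.isDigit first.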