 

Why does Scala implicitly convert Char to Int?

Tags:

types

scala

Looking at Scala's Predef object, which is automatically imported, I found the following gem:

implicit def char2int(x: Char): Int

This has let some sneaky bugs creep into my code (I used _1 instead of _2 on a Map[Char, Int]). I really don't get it: why would I want to implicitly convert Char to Int? The whole point of having a Char type (which is a mere number underneath) is so that I won't accidentally use it as a number (or vice versa).
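To make the failure mode concrete, here is a reconstruction of that kind of bug (the names are invented for illustration, not from my actual code):

val scores: Map[Char, Int] = Map('a' -> 1, 'b' -> 2)

def report(value: Int): Unit = println(s"score: $value")

// Meant pair._2 (the count), wrote pair._1 (the Char key).
// It compiles anyway: the Char is silently widened to an Int.
scores.foreach(pair => report(pair._1))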

I use Scala's type system precisely to avoid errors like that!

The only (bad) excuse I could think of is compatibility with Java's horrible behaviour.

update: The main reason given by the two answers so far is that the implicit conversion exists to support arithmetic and ordering operations on Char, so that, for instance, 'c' + 1 yields 'd'. If that's what you want, you should instead define something like

class Char ... {
  ...
  def +(x: Int): Char = (this.toInt + x).toChar
  def <(x: Char): Boolean = this.toInt < x.toInt
}

and then you could add and compare characters to your liking. The fact that Char is the only 16-bit numeric type just means we need a new Word (or Short) type.
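For what it's worth, here is a minimal runnable sketch of that idea as a wrapper type (SafeChar is a made-up name, not an existing API):

// A hypothetical wrapper exposing only the character arithmetic we want.
final case class SafeChar(underlying: Char) extends AnyVal {
  def +(x: Int): SafeChar = SafeChar((underlying.toInt + x).toChar)
  def <(that: SafeChar): Boolean = underlying < that.underlying
}

// 'c' + 1 gives 'd', but a SafeChar never silently widens to Int:
assert(SafeChar('c') + 1 == SafeChar('d'))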

Elazar Leibovich asked Sep 26 '09


1 Answer

Well, if you think in terms of how a Char is represented, a Char is just an unsigned 16-bit field, with a range of 0 to 2^16 - 1. This fits without overflow in an Int (32-bit signed, with a range of -2^31 to 2^31 - 1).

Some of Scala's basic types are, in order of the length of their representation in bits:

  • Byte (8)
  • Char (16)
  • Int (32)
  • Long (64)

All are signed except for Char, and each is convertible to any type below it in the list, as the conversion can never overflow or underflow. The one exception is Byte to Char, which doesn't exist, since a negative Byte has no Char equivalent. You can see all of those implicit conversions in Predef.
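A quick sketch of those widenings in action (behaviour as of the 2.x compilers I have used):

val b: Byte = 42
val asInt: Int = b       // Byte widens to Int
val asLong: Long = asInt // Int widens to Long

val c: Char = 'x'
val code: Int = c        // Char widens to Int (the char2int conversion)

// val bad: Char = b     // does not compile: there is no Byte-to-Char widening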

This, I believe, is why the implicit conversions exist: to allow expressions like the following to compile:

def foo(c: Char): Unit = println(c)
foo('a' + 2) // prints c

Another explanation is along the lines of the one you gave (a Char is just a number...). To me it does make sense: the set of all Chars is included in the set of all Ints, so, applying my own guidelines for using implicits, the conversion really should be implicit.

I do understand your annoyance, as I too like compilers to signal errors like the one you just gave as an example. It would be nice if Scala had a way to turn implicit conversions off (or to turn specific implicit conversions off, as turning them all off would probably wreak havoc!).
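One partial workaround I know of is masking the conversion with an explicit import, since explicit imports take precedence over the automatically imported Predef. This is only a sketch, and it only helps where the widening really comes from Predef's char2int rather than from the compiler's built-in numeric widening:

// Hide char2int in this file; everything else in Predef stays visible.
import Predef.{char2int => _, _}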

The only solution I see for your problem is using Map[RichChar, Int] or something similar. A RichChar is not implicitly converted to an Int, as implicit conversions cannot be chained. EDIT: I found out that there is actually no implicit conversion from RichChar back to Char either, so one has to be supplied by hand:

import scala.runtime.RichChar

def foo(x: Int) = x + 1

val ch = 'a'
val rch = new RichChar('a')

foo(ch) // compiles fine: Char widens to Int
// foo(rch) // does not compile: no RichChar => Int conversion

def bar(ch: Char) = println(ch)

// bar(rch) // oops... does not compile either
implicit def rch2char(rch: RichChar): Char = rch.self.asInstanceOf[Char]

bar(rch) // yay!

EDIT: Actually, if you take a good look at the Scala API, Char does have an overloaded + method that takes an Int argument, and the same goes for Int. This could have to do with the fact that the underlying JVM does something similar.

Also note that the example I gave you had nothing to do with allowing the addition of Ints to Chars! That is already allowed by the API. The more subtle point is that when you add an Int to a Char, you get an Int; the compiler is nonetheless willing to let you use the result as a Char, since 'a' + 2 is a constant expression whose value fits in Char.
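A quick illustration of that subtlety:

val n = 'a' + 2        // inferred type is Int (value 99), not Char
val c: Char = 'a' + 2  // accepted: a constant expression whose value fits in Char
// val d: Char = n + 0 // does not compile: n is not a constant expression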

Also note the more theoretical point I made above: the values of Char form a subset of the values of Int!

-- Flaviu Cipcigan

Flaviu Cipcigan answered Sep 29 '22