I have an enum for keywords and one for operators (and some others too); they all look similar, e.g.:
object Keywords extends Enumeration {
  val AND, ARRAY, BEGIN, ... = Value

  case class Keyword(keyword: Value) extends Token[Value] {
    def this(keyword: String) = this(Keywords.fromString(keyword))
    def value = keyword
  }

  implicit def valueToKeyword(keyword: Value) = new Keyword(keyword)
}
This implicit conversion allows me to pass enum values where Tokens are expected, e.g.:
def testFunction[T](t: Token[T]) = ...
testFunction(Keywords.ARRAY) // gets converted
testFunction(Operators.PLUS) // gets converted too
However, it seems that the same implicit conversion is not applied during matching, i.e.:
val token = new Keyword("ARRAY")
token match {
  case Keywords.ARRAY => ... // not selected but SHOULD be
  case Operators.PLUS => ... // completely different Enum
  ...
}
Why? And how can I overcome this?
An implicit conversion from type S to type T is defined by an implicit value which has function type S => T, or by an implicit method convertible to a value of that type. Implicit conversions are applied in two situations: first, if an expression e is of type S and S does not conform to the expression's expected type T; second, in a selection e.m with e of type S, if the selector m does not denote a member of S.
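For example (a minimal, hypothetical sketch; Celsius, doubleToCelsius and printTemp are invented here purely to illustrate the two situations):

import scala.language.implicitConversions

object ImplicitDemo {
  case class Celsius(degrees: Double)

  // hypothetical conversion from Double to Celsius
  implicit def doubleToCelsius(d: Double): Celsius = Celsius(d)

  def printTemp(t: Celsius): Unit = println(t.degrees)

  def main(args: Array[String]): Unit = {
    printTemp(21.5)       // situation 1: Double does not conform to the expected type Celsius
    println(21.5.degrees) // situation 2: Double has no member degrees, so the conversion is inserted
  }
}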
Pattern matching is a technique for checking a value against a pattern, similar to the switch statement in Java and C, and it is one of the most widely used features in Scala.
Implicit conversions in Scala are methods that are applied when an object of the wrong type is used; they allow the compiler to automatically convert a value of one type to another when the expected type differs.
Implicit conversions are evil, for several reasons. They make it hard to see what goes on in code. For instance, they might hide bad surprises like side effects or complex computations without any trace in the source code.
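For example, a conversion like the following (a hypothetical sketch; UserId, User, idToUser and greet are invented for illustration) performs work that is completely invisible at the call site:

import scala.language.implicitConversions

object HiddenCostDemo {
  case class UserId(raw: Int)
  case class User(id: UserId, name: String)

  // hypothetical conversion: every use quietly performs an expensive lookup
  implicit def idToUser(id: UserId): User = {
    println("expensive lookup for " + id) // side effect, invisible at the call site
    User(id, "user-" + id.raw)
  }

  def greet(user: User): Unit = println("hello " + user.name)

  def main(args: Array[String]): Unit =
    greet(UserId(42)) // reads like a plain call, but triggers the lookup above
}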
This doesn't work because:
token match {
  case Keywords.ARRAY => println("Array")
  case _ => println("Something else")
}
is essentially a PartialFunction with the following type signature: PartialFunction[Keywords.Value, Unit]. This means an implicit won't be applied: the function either isDefinedAt a given input or it isn't. If no specific case matches, the wildcard case _ => ... catches everything in my example code; if nothing is defined for the input and there is no wildcard, a MatchError is thrown.
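For illustration, a match with no wildcard and no matching case fails at runtime (a small sketch using the Keywords enum from the question):

val v: Keywords.Value = Keywords.BEGIN
v match {
  case Keywords.ARRAY => println("Array")
  // no wildcard here, so nothing matches BEGIN and a scala.MatchError is thrown at runtime
}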
In your case, token, which has type Token[Value], isn't covered by the partial function that the match compiles to, because only values of type Keywords.Value are handled.
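To make that concrete, the cases behave roughly like the following explicit partial function (a sketch; the actual desugaring differs, but the typing is the point):

val cases: PartialFunction[Keywords.Value, Unit] = {
  case Keywords.ARRAY => println("Array")
  case _              => println("Something else")
}

cases(Keywords.ARRAY) // fine: the argument is a Keywords.Value
// cases(token)       // does not compile: token is a Keyword/Token[Value], and
//                    // valueToKeyword converts in the wrong direction to help here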
If you really want implicits, then you could explicitly ask for an implicit (yes, that sentence is funny :)):
implicitly[Keywords.Value](token) match {
  case Keywords.ARRAY => println("Array")
  case _ => println("Something else")
}
Or you can explicitly state the type of token to invoke the implicit magic:
val token: Keywords.Value = new Keyword("ARRAY")
token match {
  case Keywords.ARRAY => println("Array")
  case _ => println("Something else")
}
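Note that both variants above also rely on a conversion in the opposite direction (from Keyword back to Keywords.Value), which the code in the question does not define. A minimal sketch of such a conversion, assuming it is added inside Keywords next to valueToKeyword:

// inside object Keywords
implicit def keywordToValue(token: Keyword): Value = token.value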
Or the simplest solution if you can live without implicits:
token.value match {
  case Keywords.ARRAY => println("Array")
  case _ => println("Something else")
}
I know it's not the answer you're looking for, but I hope you now understand what match {...} really means and what partial functions are.