I'm taking my first foray into scalaz by converting an existing class to use the Monoid trait. What I am trying to achieve is to set a view bound on my class type parameter to ensure that it can only be used with types that can be implicitly converted to a Monoid. My (simplified) class definition is thus:
import scalaz._
import Scalaz._
case class Foo[T <% Monoid[T]](v: T)
new Foo(42)
Compiling this simple example gives the compiler error:
error: No implicit view available from Int => scalaz.Monoid[Int].
Previously this view bound was defined against my own custom trait with an implicit conversion from T to the trait and this worked fine.
What am I missing now that I have converted this to scalaz?
Thanks, Chris
You should be using a context bound there, not a view bound.
import scalaz._
import Scalaz._
case class Foo[T : Monoid](v: T)
new Foo(42)
The T : Monoid notation means that an implicit value of type Monoid[T] must be in scope. In fact, it desugars to the following:
case class Foo[T](v: T)(implicit ev: Monoid[T])
This is known as the type class pattern, and you can read more about it here.
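To make the desugared form concrete, here is a minimal, self-contained sketch of the type class pattern. It uses a hand-rolled Monoid trait in place of scalaz's (so it compiles without the dependency); the names intMonoid and combine are illustrative, not scalaz API.

```scala
// A minimal stand-in for a Monoid type class (illustrative, not scalaz's actual trait)
trait Monoid[T] {
  def zero: T                 // identity element
  def append(a: T, b: T): T   // associative operation
}

object Monoid {
  // Instance for Int: 0 is the identity, + is the operation
  implicit val intMonoid: Monoid[Int] = new Monoid[Int] {
    def zero = 0
    def append(a: Int, b: Int) = a + b
  }
}

// The context bound T : Monoid desugars to the implicit parameter below
case class Foo[T](v: T)(implicit ev: Monoid[T]) {
  // The evidence lets Foo combine values of T without knowing the concrete type
  def combine(other: Foo[T]): Foo[T] = Foo(ev.append(v, other.v))
}

object Demo extends App {
  val a = Foo(42)            // compiles: an implicit Monoid[Int] is in scope
  val b = a.combine(Foo(8))
  println(b.v)               // prints 50
}
```

Calling new Foo("hello") would likewise fail to compile until a Monoid[String] instance is made implicitly available, which is exactly the constraint the context bound expresses.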