Recently I took a look at Haskell, using LYAH.
I was messing around with type classes and wrote this quick test function:
foo :: (Num x) => x -> String
foo x = show x ++ "!"
But that produces this error:
test.hs:2:9:
    Could not deduce (Show x) arising from a use of `show'
    from the context (Num x)
      bound by the type signature for foo :: Num x => x -> String
      at test.hs:1:8-29
    Possible fix:
      add (Show x) to the context of
        the type signature for foo :: Num x => x -> String
But according to LYAH:

To join Num, a type must already be friends with Show and Eq.

So if every type in Num must also be in Show and Eq, why do I need to change the type signature to foo :: (Num x, Show x) => x -> String for this to work? Shouldn't it be possible to infer that a Num is also Show-able?
The information in LYAH is out of date. The release notes for GHC 7.4.1 say:

The Num class no longer has Eq or Show superclasses.

You will need to write

foo :: (Num x, Show x) => x -> String

(In fact, the foo you wrote doesn't require Num x, so you can omit that to avoid an unnecessary constraint.)
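As a quick check, here is a minimal sketch showing that Show alone is enough for this function (the main action, the Int annotation, and the 42 test value are just assumptions added to make it runnable):

-- Only show is used, so only a Show constraint is needed.
foo :: Show x => x -> String
foo x = show x ++ "!"

main :: IO ()
main = putStrLn (foo (42 :: Int))  -- prints 42!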
It used to be that an instance of Num was also an instance of Show and Eq, but that's no longer the case. You'll need to add a Show constraint as well.
Haskell 98 and Haskell 2010 both require all instances of Num to also be instances of Show and Eq. This is largely an accident of history.
GHC, the most popular Haskell compiler, diverges from the standard here without requiring any pragma. This was done to allow applicative functors to be instances of Num and enjoy the benefits of overloaded numeric literal syntax.
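To illustrate what that buys, here is a rough sketch (not from the standard library; the instance, example, and main are assumptions) of a Num instance for functions, which could not exist if Show and Eq were still superclasses, since functions have no Show or Eq instances:

import Control.Applicative (liftA2)

-- Lift arithmetic pointwise over functions; numeric literals
-- become constant functions via fromInteger.
instance Num b => Num (a -> b) where
  fromInteger = const . fromInteger
  (+)         = liftA2 (+)
  (*)         = liftA2 (*)
  (-)         = liftA2 (-)
  abs         = fmap abs
  signum      = fmap signum

-- Overloaded literals now work in function expressions:
example :: Int -> Int
example = id + 1          -- example 5 == 6

main :: IO ()
main = print (example 5)  -- prints 6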
Shouldn't you write

foo :: (Num x) => x -> String

instead of

foo :: (Num x) x -> String

And as far as I know, this superclass relationship is outdated by now anyway.