I was trying to compile this code:
symmetric [] = True
symmetric [_] = True
symmetric l
| (head l) == (last l) = symmetric (tail (init l))
| otherwise = False
isPalindrome :: Integral a => a -> Bool
isPalindrome n = symmetric (show n)
That code did not compile, and I got an error message saying that the compiler cannot deduce (Show a):
Could not deduce (Show a) arising from a use of ‘show’
from the context (Integral a)
  bound by the type signature for
            isPalindrome :: Integral a => a -> Bool
  at 4.hs:7:17-39
Possible fix:
  add (Show a) to the context of
    the type signature for isPalindrome :: Integral a => a -> Bool
In the first argument of ‘symmetric’, namely ‘(show n)’
In the expression: symmetric (show n)
In an equation for ‘isPalindrome’:
  isPalindrome n = symmetric (show n)
It worked after changing this line:
isPalindrome :: Integral a => a -> Bool
to
isPalindrome :: (Show a, Integral a) => a -> Bool
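For reference, the full program with the extra constraint compiles and behaves as expected (the explicit signature on symmetric is added here for clarity; it is what GHC infers anyway):

```haskell
symmetric :: Eq a => [a] -> Bool
symmetric [] = True
symmetric [_] = True
symmetric l
  | head l == last l = symmetric (tail (init l))
  | otherwise        = False

isPalindrome :: (Show a, Integral a) => a -> Bool
isPalindrome n = symmetric (show n)
```

For example, `isPalindrome 12321` evaluates to True and `isPalindrome 1231` to False.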
So I was thinking: since every type in Integral is in Show, a Haskell compiler should be able to deduce (Show a) from (Integral a).
But not every type in Integral is in Show.
. That used to be the case in Haskell98, due to
class Show n => Num n
But that superclass relation ruled out an awful lot of useful number types (infinite-precision numbers, exact results of continuous functions, etc.). In modern Haskell, the classes Show and Integral have no relation at all, hence the compiler can't infer one from the other.
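To see why the deduction would be unsound, here is a made-up type (not from the original post) that is in Integral but deliberately has no Show instance; GeneralizedNewtypeDeriving lifts the numeric instances from Integer:

```haskell
{-# LANGUAGE GeneralizedNewtypeDeriving #-}

-- A hypothetical Integral type with no Show instance.
newtype Opaque = Opaque Integer
  deriving (Eq, Ord, Enum, Num, Real, Integral)

-- Opaque supports all Integral operations, e.g. toInteger, div, mod,
-- but `show (Opaque 121)` is a type error: there is no Show Opaque.
```

Given such a type, `isPalindrome (Opaque 121)` could not possibly work with only an Integral constraint, which is exactly why GHC refuses to deduce (Show a) from (Integral a).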
It is, however, indeed possible to show any integral number type independently of the actual Show class: use the showInt function for this.
import Numeric (showInt)
isPalindrome :: Integral a => a -> Bool
isPalindrome n = symmetric $ showInt n []
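One caveat worth knowing: showInt has type `Integral a => a -> ShowS` (hence the extra `[]` argument above), and it is only defined for non-negative numbers; it raises an error on negative input. A small sketch with a hypothetical `digits` helper that sidesteps this by taking the absolute value first:

```haskell
import Numeric (showInt)

-- Hypothetical helper: render the digits of a number's absolute value.
-- showInt itself errors on negative arguments, so we apply abs first.
digits :: Integral a => a -> String
digits n = showInt (abs n) []
```

For example, `digits 4321` yields "4321" and `digits (-73)` yields "73". Whether a negative number should count as a palindrome is a design choice the original post leaves open.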