I didn't expect the following code to work:
foo :: (Num a) => a -> a
foo x = x + x
main = do
print (foo (read "7"))
because it is not possible to fully infer the type of (read "7") from the code alone. But GHC (6.12.3) thinks otherwise and prints 14.
If "7" is changed to "7.2", the code fails at runtime with "Prelude.read: no parse". What's going on here? How does Haskell decide which Read instance to use?
This is caused by Haskell's defaulting rules for the Num class. If you added
default (Double, Integer)
to the top of your file, then you'd get the following results:
main = do
print (foo (read "7")) -- prints "14.0"
print (foo (read "7.2")) -- prints "14.4"
In a nutshell, defaulting rules are an attempt to "try to do the right thing" and save you from a compile error when you have an ambiguous type in your program. Unfortunately in this case it trades a compile-time error for a runtime error.
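A minimal sketch of that trade-off, assuming the implicit standard default (Integer, Double) and no default declaration of your own: the ambiguous Num type defaults to Integer, so read "7" succeeds, while read "7.2" is not a valid Integer and only blows up when the thunk is forced at runtime.

```haskell
import Control.Exception (SomeException, evaluate, try)

foo :: Num a => a -> a
foo x = x + x

main :: IO ()
main = do
  -- Ambiguous (Num a) defaults to Integer, so this parses and prints 14.
  print (foo (read "7"))
  -- "7.2" is not a valid Integer literal; the error surfaces only when
  -- the lazy result is forced, so we catch it at runtime with try/evaluate.
  r <- try (evaluate (foo (read "7.2"))) :: IO (Either SomeException Integer)
  case r of
    Left _  -> putStrLn "runtime error: no parse"
    Right n -> print n
```

Note that the "no parse" failure comes from read itself (via error), which is why it escapes the type checker entirely and has to be caught as a runtime exception.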
You can disable defaulting like so:
default ()
which will force you to explicitly disambiguate the types of such terms via type annotations:
print (foo (read "7" :: Int))
Int here is your explicit choice, not a default; with defaulting enabled, the standard default (Integer, Double) is what would have picked Integer for the ambiguous type. See sec. 6.3, Ambiguity and Type Defaulting, in A History of Haskell: Being Lazy with Class.
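Putting the pieces together, here is a small sketch of a module that disables defaulting and disambiguates each call with an annotation (the commented-out line shows what would now be rejected at compile time):

```haskell
-- Disable numeric defaulting for this module.
default ()

foo :: Num a => a -> a
foo x = x + x

main :: IO ()
main = do
  print (foo (read "7"   :: Int))     -- prints 14
  print (foo (read "7.2" :: Double))  -- prints 14.4
  -- print (foo (read "7"))  -- rejected: ambiguous type variable
```

With default () in effect, every ambiguous numeric type must be resolved by hand, turning the potential runtime "no parse" back into a compile-time error.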