 

Why doesn't f=(+) need a type annotation?

I mean, for example,

f :: (Enum a) => a -> a --without this line, there would be an error
f = succ

It's because succ needs its parameter to be enumerable (succ :: (Enum a) => a -> a)

but for (+)

f = (+) --ok

Though (+)'s declaration is (+) :: (Num a) => a -> a -> a.

I mean, why don't I need to declare f as f :: (Num a) => a -> a -> a?

asked Feb 11 '15 by sqd


2 Answers

Because of defaulting. Num is a 'defaultable' type class, meaning that if you leave it unconstrained, the compiler will make a few intelligent guesses as to which type you meant to use it as. Try putting that definition in a module, then running

:t f

in ghci; it should tell you (IIRC) f :: Integer -> Integer -> Integer. The compiler didn't know which a you wanted to use, so it guessed Integer; and since that worked, it went with that guess.
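
For instance, here is a minimal sketch of that behaviour (the module name and the commented-out bindings are mine, not from the question):

module Defaulting where

-- No signature: the monomorphism restriction keeps f monomorphic. Since
-- nothing else in this module pins the type down, the Num constraint is
-- defaulted, and ghci's :t f reports  f :: Integer -> Integer -> Integer
f = (+)

-- Because f gets a single monomorphic type, it cannot be used at two
-- different numeric types; uncommenting both lines below is a type error
-- (either one alone would compile, fixing f at that type instead):
-- atInt    = f (2 :: Int) 3
-- atDouble = f (1.5 :: Double) 2.5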

Why didn't it infer a polymorphic type for f? Because of the dreaded[1] monomorphism restriction. When the compiler sees

f = (+)

it thinks 'f is a value', which means it needs a single (monomorphic) type. Eta-expand the definition to

f x = (+) x

and you will get the polymorphic type

f :: Num a => a -> a -> a

and similarly if you eta-expand your first definition

f x = succ x

you don't need a type signature any more.
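
To see the difference, a small self-contained sketch (the module and the names add/next are mine): in the eta-expanded form, one binding can be used at several types in the same module:

module EtaExpanded where

-- Function bindings (an argument on the left-hand side) are not subject to
-- the monomorphism restriction, so both stay polymorphic:
add x = (+) x        -- inferred: add :: Num a => a -> a -> a
next x = succ x      -- inferred: next :: Enum a => a -> a

-- ...which is why they can be used at different types within one module:
demo :: (Int, Double, Char)
demo = (add 1 2, add 1.5 2.5, next 'a')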

[1] Actual name from the GHC documentation!

answered by Jonathan Cast


I mean, why don't I need to declare f as f :: (Num a) => a -> a -> a?

You do need to do that, if you declare a signature for f at all. But if you don't, the compiler will “guess” the signature itself. In this case that isn't all that remarkable, since it can basically just copy and paste the signature of (+). And that's precisely what it will do.

...or at least what it should do. It does, provided you have the -XNoMonomorphismRestriction flag on. Otherwise, well, the dreaded monomorphism restriction steps in, because f's definition has the shape ConstantApplicativeForm = Value; that makes the compiler dumb down the signature to the next best non-polymorphic type it can find, namely Integer -> Integer -> Integer. To prevent this, you should in fact supply the right signature by hand, for all top-level functions. That also prevents a lot of confusion, and makes many type errors far easier to understand.
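
As a concrete illustration of that last point (module name and the extra binding are mine): with an explicit signature, the restriction has nothing left to restrict, and f stays polymorphic:

module WithSignature where

-- An explicit polymorphic signature sidesteps the monomorphism restriction:
f :: Num a => a -> a -> a
f = (+)

-- ...so f can be used at several numeric types in the same module:
usedAtTwoTypes :: (Int, Double)
usedAtTwoTypes = (f 1 2, f 1.5 2.5)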

The monomorphism restriction is the reason

f = succ

won't work on its own: because it also has this CAF shape, the compiler does not try to infer the correct polymorphic type, but tries to find some concrete instantiation to give it a monomorphic signature. But unlike Num, the Enum class is not covered by the defaulting rules, so there is no concrete type to fall back on.
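
A compact contrast of the two cases (my own illustration, not from the answer):

module EnumDoesNotDefault where

-- Under the monomorphism restriction both bindings need one concrete type.
-- The Num constraint can be defaulted (to Integer), so this compiles:
plus = (+)

-- An Enum constraint on its own is not covered by the defaulting rules, so
-- the analogous binding is rejected without a signature; uncommenting the
-- next line is a type error:
-- next = succ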

Possible solutions, ordered by preference:

  1. Always add signatures. You really should.
  2. Enable -XNoMonomorphismRestriction (see the sketch after this list).
  3. Write your function definitions in the form f a = succ a, f a b = a+b. Because there are explicitly mentioned arguments, these don't qualify as CAFs, so the monomorphism restriction won't kick in.
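
For option 2, a minimal sketch of what turning the restriction off looks like in a source file (module name and the demo binding are mine):

{-# LANGUAGE NoMonomorphismRestriction #-}
module NoMR where

-- With the restriction switched off, both bindings are generalised even
-- though they have the ConstantApplicativeForm shape:
f = (+)     -- inferred: f :: Num a => a -> a -> a
g = succ    -- inferred: g :: Enum a => a -> a

demo :: (Integer, Double, Char)
demo = (f 1 2, f 0.5 0.5, g 'a')
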
answered by leftaroundabout