Why can the haskell compiler infer this type but ghci cannot?

Tags:

haskell

ghci

I'm working through a book to learn Haskell, looking in particular at the chapter on randomness.

I'm running the following as the file three-coins.hs:

import System.Random 

threeCoins :: StdGen -> (Bool, Bool, Bool)  
threeCoins gen =   
    let (firstCoin, newGen) = random gen
        (secondCoin, newGen') = random newGen
        (thirdCoin, newGen'') = random newGen'
    in  (firstCoin, secondCoin, thirdCoin)

main = print ( threeCoins (mkStdGen 21) )

I then execute this with runhaskell three-coins.hs and get output similar to:

(True,True,True)

Now they make the point in the notes:

Notice that we didn't have to do random gen :: (Bool, StdGen). That's because we already specified that we want booleans in the type declaration of the function. That's why Haskell can infer that we want a boolean value in this case.

That is cool.

Now when I run this in ghci with the following code:

import System.Random 

:{
threeCoins :: StdGen -> (Bool, Bool, Bool)
threeCoins gen =
    let (firstCoin, newGen) = random gen
        (secondCoin, newGen') = random newGen
        (thirdCoin, newGen'') = random newGen'
    in  (firstCoin, secondCoin, thirdCoin)
:}

I get the following response:

<interactive>:6:9: error:
    • Ambiguous type variable ‘t0’
      prevents the constraint ‘(Random t0)’ from being solved.
    • When checking that the inferred type
        newGen :: forall t. Random t => StdGen
      is as general as its inferred signature
        newGen :: StdGen
      In the expression:
        let
          (firstCoin, newGen) = random gen
          (secondCoin, newGen') = random newGen
          (thirdCoin, newGen'') = random newGen'
        in (firstCoin, secondCoin, thirdCoin)
      In an equation for ‘threeCoins’:
          threeCoins gen
            = let
                (firstCoin, newGen) = random gen
                (secondCoin, newGen') = random newGen
                ....
              in (firstCoin, secondCoin, thirdCoin)

This is interesting, and somewhat like the error the book warned us about.

So if we modify the code to put the type hints in:

import System.Random 

:{
threeCoins :: StdGen -> (Bool, Bool, Bool)
threeCoins gen =
    let (firstCoin, newGen) = random gen :: (Bool, StdGen)
        (secondCoin, newGen') = random newGen :: (Bool, StdGen)
        (thirdCoin, newGen'') = random newGen' :: (Bool, StdGen)
    in  (firstCoin, secondCoin, thirdCoin)
:}

That works fine - and we can test it with the following:

threeCoins (mkStdGen 21) 

and get this result

(True,True,True)

Huh - that worked. So the Haskell compiler can infer from the type that we provided that we want a boolean, but ghci can't.

My question is: Why can the haskell compiler infer this type but ghci cannot?

asked Dec 28 '17 by hawkeye



1 Answer

As chi already commented, this code only works when the monomorphism restriction is enabled. That restriction makes the compiler choose one specific, monomorphic type for any non-function definition, i.e. a type with no type variables in it (variables like the a in length :: [a] -> Int). So, unless you've manually specified a local signature, the compiler looks around everywhere for a hint as to what this type could be before it chooses. In your example, it sees that firstCoin, secondCoin and thirdCoin are used in the final result, which the top-level signature declares to be (Bool, Bool, Bool), so it infers that all the coins must have type Bool.
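This pinning-by-signature behaviour can be reproduced with a base-only sketch (a hypothetical example, using the Read class in place of Random): the pattern-bound v carries a class constraint, and under the monomorphism restriction the compiler resolves it by looking at the enclosing signature.

```haskell
module Main where

-- A cut-down analogue of threeCoins, using Read in place of Random.
-- 'v' is pattern-bound with no arguments, so under the monomorphism
-- restriction it must get one specific type; the compiler finds it
-- by looking at the top-level signature, which says Bool.
oneValue :: String -> Bool
oneValue s =
    let (v, _rest) = head (reads s)  -- reads :: Read a => String -> [(a, String)]
    in  v

main :: IO ()
main = print (oneValue "True trailing")  -- prints True
```

Compiled with default settings this typechecks, for the same reason the original threeCoins file does; in GHCi (monomorphism restriction off by default) the binding for v would be generalized and become ambiguous, just like the coins.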

That's fine in such a simple example, but in modern Haskell you very often need values to be more general, so you can use them in multiple different-typed contexts or as arguments to Rank-2 functions. You can always achieve this by giving explicit signatures, but especially in GHCi that's awkward (the restriction is regularly called “the Dreaded Monomorphism Restriction”), so a couple of GHC versions ago it was decided to disable it by default in GHCi. You can re-enable it in a session with :set -XMonomorphismRestriction, after which your original definition typechecks in GHCi as well.

Conceptually, firstCoin, secondCoin, thirdCoin etc. could also be more general than Bool: random is, after all, able to yield random values of any suitable type (i.e. any type that has a Random instance). So in principle, the local definitions could have a polymorphic type, like this:

threeCoins :: StdGen -> (Bool, Bool, Bool)  
threeCoins gen =   
    let firstCoin, secondCoin, thirdCoin :: Random r => r
        (firstCoin, newGen) = random gen
        (secondCoin, newGen') = random newGen
        (thirdCoin, newGen'') = random newGen'
    in  (firstCoin, secondCoin, thirdCoin)

which is basically what happens when the monomorphism restriction is turned off, as you can see by compiling your original example with the line

{-# LANGUAGE NoMonomorphismRestriction #-}

on top.

Trouble is, your code doesn't actually work with those general local signatures. The reason is a bit involved: basically, the type information of the r variable has to be propagated back into a tuple before it can be used in the random generator, and for reasons I don't fully understand either, the Hindley-Milner type system can't do that.

The best solution is not to do this manual tuple unwrapping, which is awkward anyway, but instead to use a random monad, such as RVar from the random-fu package:

import System.Random 
import Data.Random 

threeCoins :: RVar (Bool, Bool, Bool)  
threeCoins = do   
    firstCoin <- uniform False True
    secondCoin <- uniform False True
    thirdCoin <- uniform False True
    return (firstCoin, secondCoin, thirdCoin)

main = print . sampleState threeCoins $ mkStdGen 21

which works with or without the monomorphism restriction, because firstCoin, secondCoin and thirdCoin now come from a monadic bind, which is always monomorphic.

Incidentally, because you're in a monad, you can use standard combinators and thus easily shorten it to

import Control.Monad (replicateM)

threeCoins :: RVar (Bool, Bool, Bool)  
threeCoins = do   
    [firstCoin,secondCoin,thirdCoin] <- replicateM 3 $ uniform False True
    return (firstCoin, secondCoin, thirdCoin)
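For completeness, the same monadic-bind trick can be sketched without the random-fu dependency, using a plain State monad (from transformers, which ships with GHC). The step function below is a toy linear congruential generator standing in for StdGen; this is an illustrative sketch, not the answer's code:

```haskell
module Main where

import Control.Monad.Trans.State (State, state, evalState)

-- Thread the generator through a State monad by hand, so each coin
-- comes from a monadic bind and is therefore monomorphic. 'step' is
-- a toy linear congruential generator standing in for StdGen.
type Seed = Int

step :: Seed -> (Bool, Seed)
step s = let s' = (1103515245 * s + 12345) `mod` 2147483648
         in  (even s', s')

coin :: State Seed Bool
coin = state step

threeCoins :: State Seed (Bool, Bool, Bool)
threeCoins = do
    a <- coin
    b <- coin
    c <- coin
    return (a, b, c)

main :: IO ()
main = print (evalState threeCoins 21)  -- prints (True,False,True)
```

As in the RVar version, no annotations are needed and no generator names have to be threaded through by hand at the call sites.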
answered Sep 27 '22 by leftaroundabout