
Function return type not enforced

I would be very thankful if someone patient enough explained the situation below to me. It seems to me as if Haskell were ready to perform some kind of integral type coercion when returning a value from a function. On the other hand, I have read that Haskell never converts a type implicitly.

If I type in GHCi:

> import Data.Word
> let t :: (Integer, Word32); 
      t = let { y = fromIntegral (-1)     -- line breaks added for readability
              ; y' :: Integer 
              ; y' = fromIntegral y } in (y', y)

GHCi tells me later that t = (-1,4294967295). But if I constrain the local y type specifically to Word32:

> let t :: (Integer, Word32); 
      t = let { y :: Word32
              ; y = fromIntegral (-1)     -- line breaks added for readability
              ; y' :: Integer
              ; y' = fromIntegral y } in (y', y)

GHCi will tell me that t = (4294967295,4294967295).

I thought that since t's type is stated explicitly as (Integer, Word32), GHCi would conclude that y' :: Integer and y :: Word32, because the function result is (y', y). Then the type signature y :: Word32 would be completely unnecessary.

This all started when I tried to write a function to "safely" convert between Integral class members - e.g. Int -> Word32. That function was meant to return Just 1 when passed 1 and Nothing when passed -1.
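For context, here is a minimal sketch of the kind of conversion I mean (the name safeConvert is just a placeholder): it converts with fromIntegral and rejects values that don't survive a round trip through Integer.

-- Sketch only: convert between Integral types, returning Nothing
-- when the value does not survive the round trip through Integer.
safeConvert :: (Integral a, Integral b) => a -> Maybe b
safeConvert x
  | toInteger y == toInteger x = Just y
  | otherwise                  = Nothing
  where
    y = fromIntegral x

So safeConvert (1 :: Int) :: Maybe Word32 should be Just 1, while safeConvert (-1 :: Int) :: Maybe Word32 should be Nothing. (Data.Bits also exports toIntegralSized, which does essentially this, if a library function is preferred.)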

A short search through SO and the internet at large didn't provide me with any explanation.

asked Dec 23 '22 by Adam Głowacki

2 Answers

I thought that since t's type is stated explicitly as (Integer, Word32), GHCi would conclude that y' :: Integer and y :: Word32, because the function result is (y', y).

No, it does not infer anything about the types of y' and y. It just checks whether they are compatible with the expected type. A simpler example [1]:

x = -1

y :: Integer
y = x

y' :: Word32
y' = x

Which type does x have? It's neither Integer nor Word32. It has the polymorphic type of numeric literals:

x :: Num a => a
x = fromInteger (-1 :: Integer)

And Num a => a is compatible both with the usage as an Integer in y and as a Word32 in y'. It doesn't matter how x is used; the type of x is determined only by how the term is defined.
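You can check this directly in GHCi (my own quick transcript, not from the original example):

> import Data.Word
> :type (-1)
(-1) :: Num a => a
> (-1) :: Integer
-1
> (-1) :: Word32
4294967295

The same polymorphic literal is used once at Integer and once at Word32, just like x above.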


To explain your results, remember referential transparency: we can simply replace the variables with their definitions:

t = let y = fromIntegral (-1)
        y' = (fromIntegral y) :: Integer
    in (y', y) :: (Integer, Word32)

expands to

t = ( (fromIntegral (fromIntegral (-1))) :: Integer -- nothing says Word32 here
    , fromIntegral (-1)
    ) :: (Integer, Word32)
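Evaluating that expanded term in GHCi (my own check) reproduces the first result:

> (fromIntegral (fromIntegral (-1)) :: Integer, fromIntegral (-1)) :: (Integer, Word32)
(-1,4294967295)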

whereas the second

t = let y = (fromIntegral (-1)) :: Word32
        y' = (fromIntegral y) :: Integer
     in (y', y) :: (Integer, Word32)

expands to

t = ( (fromIntegral ( (fromIntegral (-1)) :: Word32 )) :: Integer
    , (fromIntegral (-1)) :: Word32
    ) :: (Integer, Word32)
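Evaluating this one in the same way reproduces the second result:

> (fromIntegral ((fromIntegral (-1)) :: Word32) :: Integer, (fromIntegral (-1)) :: Word32) :: (Integer, Word32)
(4294967295,4294967295)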

[1]: I hope the dreaded monomorphism restriction doesn't mess with us here. Can anyone more knowledgeable confirm that it doesn't apply to x (or under what circumstances)?

answered Jan 03 '23 by Bergi


The result depends on the -X[No]MonomorphismRestriction option. By default, GHCi uses -XNoMonomorphismRestriction, so it infers the polymorphic type y :: Num a => a. In the expression fromIntegral y that type is instantiated at Integer (the default instance for the Num class), while in the expression (y', y) it is instantiated at Word32. When you explicitly specify the type of y, the variable y is Word32 throughout the whole expression.

If you run the command :set -XMonomorphismRestriction, all variants will give the same result.
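For example, rerunning the first snippet from the question with the restriction enabled (my own transcript) now gives the second result:

> :set -XMonomorphismRestriction
> import Data.Word
> let t :: (Integer, Word32); t = let { y = fromIntegral (-1); y' :: Integer; y' = fromIntegral y } in (y', y)
> t
(4294967295,4294967295)

With the restriction on, y gets a monomorphic type, which unification with the result tuple fixes to Word32.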

answered Jan 03 '23 by freestyle