Can I print in Haskell the type of a polymorphic function as it would become if I passed to it an entity of a concrete type?

Here's a function polymorphic in 3 types:

:t (.)
(.) :: (b -> c) -> (a -> b) -> a -> c

and here's a non-polymorphic function:

:t Data.Char.digitToInt
Data.Char.digitToInt :: Char -> Int

If we apply the former to the latter, we get a function polymorphic in 1 type:

:t (.) Data.Char.digitToInt
(.) Data.Char.digitToInt :: (a -> Char) -> a -> Int

which means that (.) was "instantiated" (I'm not sure this is the correct term; as a C++ programmer, that's what I'd call it) with b === Char and c === Int, so the signature of the (.) that gets applied to digitToInt is the following:

(Char -> Int) -> (a -> Char) -> a -> Int
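
(As a sanity check, GHCi accepts this specialization if I spell it out by hand as an annotation:

:t (.) :: (Char -> Int) -> (a -> Char) -> a -> Int

but that of course requires me to already know the signature I'm looking for.)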

My question is: is there a way to have this signature printed on screen, given (.), digitToInt and the "information" that I want to apply the former to the latter?

For those who are interested, this question was earlier closed as a duplicate of this one.

asked Dec 11 '20 by Enlico


3 Answers

There's this neat little function hidden in a corner of the Prelude:

Prelude.asTypeOf :: a -> a -> a
asTypeOf x _ = x

It's documented as "forcing its first argument to have the same type as the second." We can use this to force the type of (.)'s first argument:

-- (.) = \x -> (.) x = \x -> (.) $ x `asTypeOf` Data.Char.digitToInt
-- eta expansion followed by definition of asTypeOf
-- the RHS is just (.), but restricted to arguments with the same type as digitToInt
-- "what is the type of (.) when the first argument is (of the same type as) digitToInt?"
ghci> :t \x -> (.) $ x `asTypeOf` Data.Char.digitToInt
\x -> (.) $ x `asTypeOf` Data.Char.digitToInt
  :: (Char -> Int) -> (a -> Char) -> a -> Int

Of course, this works for as many arguments as you need.

ghci> :t \x y -> (x `asTypeOf` Data.Char.digitToInt) . (y `asTypeOf` head)
\x y -> (x `asTypeOf` Data.Char.digitToInt) . (y `asTypeOf` head)
  :: (Char -> Int) -> ([Char] -> Char) -> [Char] -> Int
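
For contrast, asking GHCi about the composed expression itself only reports the overall result type, not the specialized type of (.); a query along these lines:

ghci> :t Data.Char.digitToInt . head
Data.Char.digitToInt . head :: [Char] -> Int

gives no hint of how (.) itself was instantiated.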

You can consider this a variation of @K.A.Buhr's idea in the comments (using a function with a signature more restrictive than its implementation to guide type inference), except that we don't have to define anything ourselves, at the cost of not being able to copy the expression in question verbatim under a lambda.

answered Oct 16 '22 by HTNW


I think @HTNW's answer probably covers it, but for completeness, here's how the inContext solution works in detail.

The type signature of the function:

inContext :: a -> (a -> b) -> a

means that, if you have a thing you want to type, and a "context" in which it's used (expressible as a lambda that takes it as an argument), say with types:

thing :: a1
context :: a2 -> b

you can force unification of a1 (the general type of thing) with a2 (the constraints of the context) simply by constructing the expression:

thing `inContext` context

Normally, the unified type thing :: a would be lost, but the type signature of inContext implies that the type of this whole resulting expression will also be unified with the desired type a, and GHCi will happily tell you the type of that expression.

So the expression:

(.) `inContext` \hole -> hole digitToInt

ends up getting assigned the type that (.) would have within the specified context. You can write this, somewhat misleadingly, as:

(.) `inContext` \(.) -> (.) digitToInt

since (.) is as good an argument name for an anonymous lambda as hole is. This is potentially confusing, since we're creating a local binding that shadows the top-level definition of (.), but it's still naming the same thing (with a refined type), and this abuse of lambdas allowed us to write the original expression (.) digitToInt verbatim, with the appropriate boilerplate.

It's actually irrelevant how inContext is defined, as long as it keeps that type signature: if you're just asking GHCi for the type of the expression, even inContext = undefined would have worked. But, just looking at the type signature, it's easy enough to give inContext a working definition:

inContext :: a -> (a -> b) -> a
inContext a _ = a

It turns out that this is just the definition of const, so inContext = const works, too, provided the restrictive type signature is kept (a bare inContext = const would be inferred at the more general type a -> b -> a and wouldn't constrain anything).
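
For completeness, with a definition that keeps that signature in scope, the original example reports roughly the following in GHCi:

λ> inContext :: a -> (a -> b) -> a; inContext = const
λ> :t (.) `inContext` \hole -> hole Data.Char.digitToInt
(.) `inContext` \hole -> hole Data.Char.digitToInt
  :: (Char -> Int) -> (a -> Char) -> a -> Int

which is exactly the specialized signature the question was after.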

You can use inContext to type multiple things at once, and they can be expressions instead of names. To accommodate the former, you can use tuples; for the latter to work, you have to use more sensible argument names in your lambdas.

So, for example:

λ> :t (fromJust, fmap length) `inContext` \(a,b) -> a . b
(fromJust, fmap length) `inContext` \(a,b) -> a . b
  :: Foldable t => (Maybe Int -> Int, Maybe (t a) -> Maybe Int)

tells you that in the expression fromJust . fmap length, the types have been specialized to:

fromJust :: Maybe Int -> Int
fmap length :: Foldable t => Maybe (t a) -> Maybe Int

answered Oct 16 '22 by K. A. Buhr


Other answers require the help of functions that have been defined with artificially restricted types, such as the asTypeOf function in the answer from HTNW. This is not necessary, as the following interaction shows:

Prelude> let asAppliedTo f x = const f (f x)

Prelude> :t head `asAppliedTo` "x"
head `asAppliedTo` "x" :: [Char] -> Char

Prelude> :t (.) `asAppliedTo` Data.Char.digitToInt
(.) `asAppliedTo` Data.Char.digitToInt
  :: (Char -> Int) -> (a -> Char) -> a -> Int

This exploits the lack of polymorphism in the lambda-binding that is implicit in the definition of asAppliedTo. Both occurrences of f in its body must be given the same type, and that is the type of its result. The function const used here also has its natural type a -> b -> a:

const x y = x
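
The helper doesn't even need a name: the same lambda can be applied anonymously, and GHCi should report the same specialized signature, for example:

Prelude> :t (\f x -> const f (f x)) (.) Data.Char.digitToInt
(\f x -> const f (f x)) (.) Data.Char.digitToInt
  :: (Char -> Int) -> (a -> Char) -> a -> Int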

answered Oct 16 '22 by Mike Spivey