Disclaimer: I’m just starting to learn Haskell and I’m not sure “strict” is the right word here.
I was trying to narrow down my problem but I couldn’t really find the issue, so here is my code that doesn’t compile:
module Json where

import Data.List (intersperse)

data JNode =
      JObject [(String, JNode)]
    | JArray [JNode]
    | JString String
    | JNumber Double
    | JBool Bool
    | JNull

instance Show JNode where
    show = show_node 0 where
        glue = foldl (++) ""
        show_tabs n = glue $ take n $ repeat " "
        show_list n = glue . intersperse ",\n" . map (show_pair (n + 1))
        show_sect n l r xs = glue ["\n", tabs, l, "\n", show_list n xs, "\n", tabs, r] where tabs = show_tabs n
        -- show_pair :: (Show a) => Int -> (a, JNode) -> String -- works when uncommented
        show_pair n (name, val) = glue [show_tabs n, show name, " : ", show_node n val]
        show_node n (JObject xs) = show_sect n "{" "}" xs
        show_node n (JArray xs) = show_sect n "[" "]" $ zip [0..] xs
        show_node n (JString x ) = show x
        show_node n (JNumber x ) = show x
        show_node n (JBool x ) = show x
        show_node n (JNull ) = "null"
The error is:
Prelude> :l scripts\json.hs
[1 of 1] Compiling Json ( scripts\json.hs, interpreted )
scripts\json.hs:21:59:
No instance for (Enum String)
arising from the arithmetic sequence `0 .. '
In the first argument of `zip', namely `([0 .. ])'
In the second argument of `($)', namely `zip ([0 .. ]) xs'
In the expression: show_sect n "[" "]" $ zip ([0 .. ]) xs
scripts\json.hs:21:60:
No instance for (Num String) arising from the literal `0'
In the expression: 0
In the first argument of `zip', namely `[0 .. ]'
In the second argument of `($)', namely `zip [0 .. ] xs'
Failed, modules loaded: none.
Take a look at the line of code with the comment. Apparently, when there is no type declaration, it requires me to pass a String instead of just any Show a. Funnily enough, it still requires name to be a String even when I don't use it at all, e.g. when I replace the show_pair implementation with this:
show_pair n (name, val) = show_node n val
Can someone explain to me why it works the way it does?
Here is a simplified version of my code with the same issue, in case anyone wants to improve the answer:
data TFoo =
      FooStr (String, TFoo)
    | FooNum (Int, TFoo)
-- show_pair :: (a, TFoo) -> String
show_pair (_, val) = show_node val
show_node (FooStr x) = show_pair x
show_node (FooNum x) = show_pair x
tl;dr: always use explicit signatures when you want something to be polymorphic.
In System F (which is what Haskell's type system is based on), every type variable that's used for polymorphism needs to be explicitly quantified into scope by a type-level lambda / forall (∀). So in fact you'd need to have
show_pair :: ∀ a. (Show a) => Int -> (a, JNode) -> String
It could be argued that Haskell should require this too, but it doesn't. It just quantifies any variable that you mention in an explicit signature†, so you can simply write
show_pair :: (Show a) => Int -> (a, JNode) -> String
Also, it tries to introduce as many type variables as possible into top-level bindings that have no signature‡.
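For instance (a made-up top-level binding, not from the question; the names labelOf and labels are hypothetical), GHC infers a polymorphic type here even though no signature is given:

-- Hypothetical example: a top-level function binding is generalised even
-- without a signature; GHC infers   labelOf :: Show a => (a, b) -> String
labelOf (x, _) = show x

-- ...and it can then be used at different instantiations in the same module:
labels :: String
labels = labelOf (1 :: Int, ()) ++ labelOf ("x", ())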
However, it does not automatically introduce any type variables into local bindings. (As remarked in the comments, this isn't actually true: already since GHC 7.0 there's a thing called let generalisation, which does make local bindings polymorphic even without a signature; I wasn't aware that this is on by default. Even if it's possible to omit the signatures, polymorphic functions should still better have them, IMO.)

Since the compiler knows exactly where show_pair is used, it has a full context of at least one type instantiation that you will need. What it assumes is that you only need one instantiation. With that it tries to infer some type for a monomorphic show_pair, and fails. Only by adding an explicit signature do you force the compiler to consider a polymorphic one.
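To make that concrete, here is a sketch of the simplified example from the question with the signature uncommented (the signature on show_node is my addition, matching what GHC would infer once show_pair is polymorphic); the explicit signature is what lets show_pair be used at both (String, TFoo) and (Int, TFoo):

data TFoo =
      FooStr (String, TFoo)
    | FooNum (Int, TFoo)

-- The explicit signature quantifies 'a', so the two call sites below may
-- instantiate it differently (String and Int).
show_pair :: (a, TFoo) -> String
show_pair (_, val) = show_node val

-- Assumed signature, added for clarity.
show_node :: TFoo -> String
show_node (FooStr x) = show_pair x
show_node (FooNum x) = show_pair x

-- (Like the original sketch, this has no base case, so it type-checks but
-- would not terminate if actually run.)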
†Provided you've not already introduced that variable in a higher scope, with the -XScopedTypeVariables extension.
‡Unfortunately, the monomorphism restriction makes this awfully hard to rely on.
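A standard illustration of ‡ (not from the question; the module and the name plus are hypothetical, and this assumes default GHC settings in a compiled module, i.e. the monomorphism restriction is on):

module MRDemo where

-- Without a signature, 'plus = (+)' is a binding with no arguments on the
-- left-hand side, so the monomorphism restriction prevents generalisation
-- and defaulting fixes it at   Integer -> Integer -> Integer:
--
-- plus = (+)

-- An explicit signature keeps it polymorphic:
plus :: Num a => a -> a -> a
plus = (+)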