
Haskell type error: Could not deduce (Show a) arising from a use of `show' from the context (Num a)

I am configuring xmonad, and since I have to start a couple of dzen instances, I decided that it could be better to use a function that takes the parameters for x and y position, width, height and text align:

-- mydzen.hs

import Data.List

-- | Return a string that launches dzen with the given configuration.
myDzen :: Num a  => a -> a -> a -> Char -> String -> String

myDzen y x w ta e =
        intercalate " "
            [ "dzen2"
            , "-x"  , show x
            , "-w"  , show w
            , "-y"  , show y
            , "-h"  , myHeight
            , "-fn" , quote myFont
            , "-bg" , quote myDBGColor
            , "-fg" , quote myFFGColor
            , "-ta" , [ta]
            , "-e"  , quote e
            ]

quote :: String -> String
quote x = "'" ++ x ++ "'"

-- dummy values
myHeight = "20"
myFont = "bitstream"
myDBGColor = "#ffffff"
myFFGColor = "#000000"

Could not deduce (Show a) arising from a use of `show'
from the context (Num a)
  bound by the type signature for
             myDzen :: Num a => a -> a -> a -> Char -> String -> String
  at mydzen.hs:(5,1)-(17,13)
Possible fix:
  add (Show a) to the context of
    the type signature for
      myDzen :: Num a => a -> a -> a -> Char -> String -> String
In the expression: show x
In the second argument of `intercalate', namely
  `["dzen2", "-x", show x, "-w", ....]'
In the expression:
  intercalate " " ["dzen2", "-x", show x, "-w", ....]

Obviously, deleting the signature, or changing Num a to Show a, solves the issue, but I can't understand why. The arguments x, w and y are supposed to accept almost any kind of number (100, 550.2, 1366 * 0.7, etc.).

I'm new to Haskell, and so far I haven't been able to clearly understand the error or figure out what's wrong.

Asked Jun 25 '12 by Pablo Olmos de Aguilera C.

3 Answers

Previously, Show and Eq were superclasses of Num, and this code would compile.

Since GHC 7.4 this has changed: Num no longer depends on Show and Eq, so you need to add those constraints to your signatures yourself.

The reason is separation of concerns: there are numeric types with no sensible equality or show function (computable reals, rings of functions).
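A ring of functions, mentioned above, is the classic illustration: pointwise arithmetic gives functions a perfectly usable Num instance, yet there is no sensible way to Show or Eq a function. A minimal sketch (an orphan instance, for demonstration only):

```haskell
-- Functions into a numeric type form a ring under pointwise
-- arithmetic. This is a lawful Num instance, but a -> b has no
-- sensible Show or Eq instance, so Num cannot imply Show.
instance Num b => Num (a -> b) where
  f + g         = \x -> f x + g x
  f * g         = \x -> f x * g x
  f - g         = \x -> f x - g x
  abs f         = abs . f
  signum f      = signum . f
  fromInteger n = const (fromInteger n)

-- Pointwise addition in action: (sin + cos) 0 == sin 0 + cos 0 == 1.0
```

With the old class hierarchy, an instance like this would have forced you to write a bogus Show instance for functions; the GHC 7.4 split removes that obligation.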

Answered Nov 16 '22 by sdcvvc


Not all numbers are showable (standard types like Int or Integer are, but you could have defined your own numeric type; how would the compiler know?). You're using show x in your function, therefore a must belong to the Show typeclass. Change your signature to myDzen :: (Num a, Show a) => a -> a -> a -> Char -> String -> String and the error goes away.
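To see the combined constraint at work, here is a small sketch (showCoord is a hypothetical helper, not part of the question's code) showing that once Show a is added alongside Num a, the same function accepts any showable numeric type:

```haskell
-- Hedged sketch: the (Num a, Show a) context lets us both treat x
-- and y as numbers and render them with show, for Int, Double, etc.
showCoord :: (Num a, Show a) => a -> a -> String
showCoord x y = "-x " ++ show x ++ " -y " ++ show y

-- showCoord (10 :: Int) 20        → "-x 10 -y 20"
-- showCoord (550.2 :: Double) 0   → "-x 550.2 -y 0.0"
```

The same two-constraint context is exactly what myDzen needs, since it both takes numeric positions and passes them through show.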

Answered Nov 16 '22 by Artyom


show is not part of the Num type class; to be able to use it, you have to add a Show constraint alongside the Num one:

myDzen :: (Num a, Show a) => a -> a -> a -> Char -> String -> String

Answered Nov 16 '22 by Cat Plus Plus