GHC Haskell seems to require a digit in front of the decimal point in order to read a Double. Here's the code:
main :: IO ()
main = do
  let d1 = read "0.3" :: Double
      d2 = read ".3" :: Double
  print d1
  print d2
Running this produces:
0.3
*** Exception: Prelude.read: no parse
Is this a GHC bug or just a major limitation?
(I tried reading ".3" with C, JavaScript and MS Excel, and all of them successfully parse ".3" as a number. I think I'm seeing the effects of this issue in other areas of my program, including reading command-line arguments with the parseargs package and reading Doubles in HTML forms with Yesod's MForms.)
Is there a known fix or work-around for this issue?
From the Haskell report:
2.5 Numeric Literals
A floating literal must contain digits both before and after the decimal point; this ensures that a decimal point cannot be mistaken for another use of the dot character.
So this is the expected behaviour.
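Since read follows the Report's lexical syntax, a simple workaround is to normalise the input string before parsing: prepend a "0" when the string starts with a bare decimal point. Below is a minimal sketch; readDouble is a hypothetical helper name, and it uses readMaybe from Text.Read so malformed input yields Nothing instead of an exception.

import Text.Read (readMaybe)

-- Hypothetical helper: turn ".3" into "0.3" (and "-.3" into "-0.3")
-- before handing the string to the standard Double parser.
readDouble :: String -> Maybe Double
readDouble ('.':rest)     = readMaybe ('0' : '.' : rest)
readDouble ('-':'.':rest) = readMaybe ('-' : '0' : '.' : rest)
readDouble s              = readMaybe s

The same normalisation step can be applied to strings coming from command-line arguments or form fields before they reach read, which sidesteps the limitation without changing the parsing libraries themselves.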