GHC accepts this code, but it ought to be illegal syntax(?) Any guesses as to what's going on?
module Tilde where
~ x = x + 2 -- huh?
~ x +++ y = y * 3 -- this makes sense
The (+++) equation makes sense: it declares an operator, using infix syntax, with an irrefutable pattern match on the first argument.
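(A quick sanity check of my own, not part of the module above: because the first argument is matched lazily and never used, even an undefined left operand works:)
λ> undefined +++ 4
===> 12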
The first 'equation' looks much the same to start with, but there's no operator. If I ask
λ> :i ~
===> <interactive>:1:1: error: parse error on input `~'
λ> :i (~)
===> class (a ~ b) => (~) (a :: k) (b :: k)
-- Defined in `Data.Type.Equality'
instance [incoherent] forall k (a :: k) (b :: k). (a ~ b) => a ~ b
-- Defined in `Data.Type.Equality'
which is a bemusing discovery, but it seems to have nothing to do with it(?) Not surprisingly, I can't define my own class or operator (~): GHC rejects the attempt with "Illegal binding of built-in syntax".
Oh:
λ> :i x
===> x :: Integer -- GHCi defaulting, presumably
and trying to evaluate x loops forever. So the strangeness is actually defining
x = x + 2
Then what is the ~ doing?
Writing
x = 5
creates a global variable named x, bound to the value 5. Adding a tilde makes the pattern match irrefutable, but it was already irrefutable, so that doesn't make much sense. But it's perfectly legal to write something like
(xs, ys) = span odd [1..10]
This defines two global variables, xs and ys: span odd splits the list at the first element that isn't odd, so xs is the leading run of odd numbers (here just [1]) and ys is the rest of the list ([2..10]). You could even mark this pattern irrefutable by adding a tilde. Of course, the pattern can't fail anyway (the expression is well-typed and always yields a pair), so there's no point to that. But consider:
~(x:xs) = filter odd [1..10]
This defines two global variables, x and xs, provided the filter returns at least one result. If the filter were to return zero results, the pattern match would fail. (In practice, this means that accessing x or xs would throw a pattern-match failure exception.)
You can even write utterly bizarre stuff like
False = True
This seemingly nonsensical declaration pattern-matches the pattern False against the value True. Since the binding introduces no variables, nothing ever forces the match, so it does nothing either way. It's one of those obscure corners of the language.
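A tiny sketch, in case you want to convince yourself GHC really accepts it (the module and main are my own additions):
module Bizarre where
False = True -- accepted; GHC may warn that the binding binds no variables
main :: IO ()
main = putStrLn "runs fine: the bogus match is never even attempted"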
The tilde is doing exactly what it did in your other example: it makes the pattern irrefutable (so the pattern match cannot fail). The pattern was already irrefutable, of course, in both cases (being a plain variable, which always matches), but that doesn't make the tilde illegal, just unnecessary.
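For contrast, here is a sketch (the function names are mine) of a position where the tilde is not redundant. Constructor patterns in function arguments are matched strictly, unlike top-level pattern bindings, so adding the tilde genuinely changes behaviour:
strictPair :: (Int, Int) -> Int
strictPair (_, _) = 42 -- matching the (,) constructor forces the argument
lazyPair :: (Int, Int) -> Int
lazyPair ~(_, _) = 42 -- the tilde defers the match, so the argument is never forced
λ> strictPair undefined
===> *** Exception: Prelude.undefined
λ> lazyPair undefined
===> 42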