I was debugging something and discovered some strangeness in JavaScript:
alert(1=='') ==> false
alert(0=='') ==> true
alert(-1=='') ==> false
With an implied string comparison I would expect 0 to be converted to '0', which is not equal to ''. That is exactly how every non-zero value behaves, so why not zero?
According to the Mozilla documentation on JavaScript comparison operators:
If the two operands are not of the same type, JavaScript converts the operands then applies strict comparison. If either operand is a number or a boolean, the operands are converted to numbers; if either operand is a string, the other one is converted to a string.
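Reading that, it is not obvious which branch applies when a number meets a string. These quick console checks (any modern engine should agree) show the mixed-type cases in question:
0 == false   // true  -- the boolean is converted to a number
1 == '1'     // true  -- but is '1' becoming 1, or 1 becoming '1'?
0 == ''      // true  -- the surprising case from above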
What's actually happening is that the strings are being converted to numbers. For example:
1 == '1'
becomes 1 == Number('1')
becomes 1 == 1, which is true.
Then try this one:
1 == '1.'
becomes 1 == Number('1.')
becomes 1 == 1, which is also true.
If the numbers were being converted to strings instead, you'd get '1' == '1.', which would be false.
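You can check both halves of that claim in a console (nothing here is engine-specific):
Number('1.')   // 1     -- a trailing dot still parses as a number
'1' === '1.'   // false -- as strings they differ
1 == '1.'      // true  -- so the comparison must be numeric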
It just so happens that Number('') is 0, and therefore 0 == '' is true.
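The same conversion explains a few related cases; these are only illustrative checks, not an exhaustive list:
Number('')      // 0  -- the empty string converts to 0
Number('   ')   // 0  -- whitespace-only strings do as well
Number('-1')    // -1
0 == ''         // true,  since it becomes 0 == 0
0 == '   '      // true,  for the same reason
-1 == ''        // false, since it becomes -1 == 0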
When JavaScript does implicit type conversion, the empty string compares equal to the number 0. Use strict equality for your comparison and you'll get the result you expect:
alert(1==='') ==> false
alert(0==='') ==> false
alert(-1==='') ==> false
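If the real question is "is this value the empty string?", a strict check avoids the coercion entirely. A minimal sketch, with isEmptyString as a made-up helper name:
// true only for the empty string itself, never for 0, null, or undefined
function isEmptyString(value) {
  return value === '';
}

isEmptyString('');   // true
isEmptyString(0);    // false
isEmptyString('0');  // false
Strict equality never converts types, so 0 and '' are simply different values to it.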