I'm writing a regex to be used with JavaScript. When testing I came across some strange behavior and boiled it down to the following:
/^[a-z]/.test("abc"); // <-- returns true as expected
/^[a-z]/.test(null); // <-- returns true, but why?
I was assuming that the last case would return false,
since the value does not match the regex (it is null and therefore does not start with a character in the range). So, can anyone explain to me why this is not the case?
If I do the same test in C#:
var regex = new Regex("^[a-z]");
var res = regex.IsMatch(null); // <-- ArgumentNullException
... I get an ArgumentNullException
which makes sense. So, I guess when testing a regex in JavaScript, you have to manually do a null check?
I have tried searching for an explanation, but without any luck.
Here null is getting typecast to a String, which gives "null". And "null" matches your provided regex, which is why it evaluates to true.
In JavaScript, almost every value can be converted to a string when one is needed: objects do it through their toString method, which is called internally whenever a typecast is required, and null and undefined (which are not objects) are given the fixed strings "null" and "undefined" by the language's ToString conversion.
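You can see the same conversion yourself in the console; a minimal sketch (String(...) applies the same ToString conversion that test() uses internally):
String(null);                  // "null"
String(undefined);             // "undefined"
/^[a-z]/.test(String(null));   // true, "null" starts with a lowercase letter
/^[a-z]/.test(undefined);      // true, "undefined" also starts with a lowercase letter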
That's because test converts its argument: null is converted to the string "null".
You can check that in the console: /^null$/.test(null) returns true.
The call to ToString(argument) is specified in the ECMAScript specification (see also ToString).
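If you want the C#-like behaviour of rejecting null instead of coercing it, you do indeed have to check yourself. A minimal sketch of such a guard (the name safeTest is just an illustration, not a standard API):
function safeTest(regex, value) {
  // Reject null/undefined explicitly instead of letting them coerce to "null"/"undefined"
  if (value === null || value === undefined) {
    return false;
  }
  return regex.test(value);
}
safeTest(/^[a-z]/, "abc"); // true
safeTest(/^[a-z]/, null);  // false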