This is blowing my mind.
All I want to do is basic string comparison on a long varchar field.
I have a table of approx. 12M records.
If I query for MY_FIELD='a string', I get a count of 25947, which seems about right.
If I query for MY_FIELD!='a string', I get a count of 989.
Shouldn't these two counts add up to the full table size of ~12M?
And in how many of those rows is MY_FIELD set to NULL?
a. select count(*) from mytable;
b. select count(*) from mytable where my_field is null;
c. select count(*) from mytable where my_field is not null;
d. select count(*) from mytable where my_field = 'some value';
e. select count(*) from mytable where my_field != 'some value';
NULL is neither equal nor unequal to any value, including NULL itself, so I would expect d + e to equal c, and b + c to equal a. Rows where my_field is NULL are excluded by both the = and the != predicates; that is where your "missing" ~12M rows went.
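This behavior can be demonstrated with a small in-memory SQLite table (the table name and sample data below are hypothetical, just to illustrate the three-valued logic):

```python
import sqlite3

# Build a tiny table where some rows hold NULL in my_field.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mytable (my_field TEXT)")
conn.executemany(
    "INSERT INTO mytable VALUES (?)",
    [("a string",), ("other",), (None,), (None,)],
)

def count(where):
    # Helper: COUNT(*) with an arbitrary WHERE clause.
    return conn.execute(f"SELECT COUNT(*) FROM mytable WHERE {where}").fetchone()[0]

total = conn.execute("SELECT COUNT(*) FROM mytable").fetchone()[0]  # 4
eq = count("my_field = 'a string'")    # 1
ne = count("my_field != 'a string'")   # 1 -- the two NULL rows match neither predicate
nulls = count("my_field IS NULL")      # 2

# eq + ne is 2, not 4: NULL comparisons yield UNKNOWN, so those rows are filtered out.
# To count "everything except 'a string'", include the NULLs explicitly:
ne_or_null = count("my_field != 'a string' OR my_field IS NULL")  # 3

print(total, eq, ne, nulls, ne_or_null)
```

So in your table, e plus the NULL count from query b should bring d + e back up to the full row count.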