My question is: why doesn't the first snippet work correctly, while the second one works perfectly?
First code:
function oddOrEven(N) {
  if (N % 2 == 1) {
    console.log('Weird');
  } else if (2 <= N <= 5) {
    console.log('Not Weird');
  } else if (6 <= N <= 20) {
    console.log('Weird');
  } else if (N > 20) {
    console.log('Not Weird');
  }
}
Second code:
function oddOrEven(N) {
  if (N % 2 == 1) {
    console.log('Weird');
  } else if (N >= 2 && N <= 5) {
    console.log('Not Weird');
  } else if (N >= 6 && N <= 20) {
    console.log('Weird');
  } else if (N > 20) {
    console.log('Not Weird');
  }
}
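To see the difference concretely, you can run both versions against the same input. For an even number between 6 and 20, such as N = 8, the first version prints the wrong label. The sketch below uses the illustrative names `firstVersion` and `secondVersion`, and returns the string instead of logging it so the two results are easy to compare:

```javascript
// First version: chained comparison, actually evaluated as ((2 <= N) <= 5)
function firstVersion(N) {
  if (N % 2 == 1) {
    return 'Weird';
  } else if (2 <= N <= 5) {
    return 'Not Weird';
  } else if (6 <= N <= 20) {
    return 'Weird';
  } else if (N > 20) {
    return 'Not Weird';
  }
}

// Second version: two comparisons joined explicitly with &&
function secondVersion(N) {
  if (N % 2 == 1) {
    return 'Weird';
  } else if (N >= 2 && N <= 5) {
    return 'Not Weird';
  } else if (N >= 6 && N <= 20) {
    return 'Weird';
  } else if (N > 20) {
    return 'Not Weird';
  }
}

console.log(firstVersion(8));  // 'Not Weird' -- wrong: 2 <= 8 is true, and true <= 5 holds
console.log(secondVersion(8)); // 'Weird' -- correct: 8 is even and between 6 and 20
```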
Taking 2 <= N <= 5
as an example, both of the following return true:
2 <= 3 <= 5
2 <= 6 <= 5
The relational operators are left-associative, so the leftmost comparison is evaluated first; it is true in both cases. Then the following happens:
true <= 5
true <= 5
which really evaluates as:
1 <= 5
1 <= 5
both of which are true. Hence the second example yields a spurious true, even though the inequality 2 <= 6 <= 5 is mathematically false.
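The step-by-step coercion described above can be checked directly in a JavaScript console:

```javascript
// A chained comparison parses left to right: (2 <= 6 <= 5) is ((2 <= 6) <= 5).
console.log(2 <= 6);        // true
// The boolean result is coerced to a number (true -> 1) for the next comparison.
console.log(Number(true));  // 1
console.log(true <= 5);     // true, because 1 <= 5
// So the whole chain is true even though 6 is clearly not <= 5.
console.log(2 <= 6 <= 5);   // true
console.log((2 <= 6) <= 5); // true, the same expression with explicit grouping
```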
In practice, your second snippet is how you would write such a compound inequality in JavaScript, and in most other languages.