I was expecting that in my following code:
#include <stdio.h>

int main()
{
    int i = 10;
    int j = 10;

    j = ++(i | i);
    printf("%d %d\n", j, i);

    j = ++(i & i);
    printf("%d %d\n", j, i);

    return 1;
}
the expressions j = ++(i | i); and j = ++(i & i); would produce lvalue errors like these:
x.c: In function ‘main’:
x.c:6: error: lvalue required as increment operand
x.c:9: error: lvalue required as increment operand
But I was surprised that the above code compiled successfully, as below:
~$ gcc x.c -Wall
~$ ./a.out
11 11
12 12
The above code runs and appears to work correctly.
Meanwhile, other operators produce an error (as I understand it). Even the bitwise XOR operator causes an error: j = ++(i ^ i); (check: other operators produce an lvalue error at compile time).
What is the reason? Is this unspecified or undefined behavior? Or are the bitwise OR and AND operators somehow different?
compiler version:
gcc version 4.4.5 (Ubuntu/Linaro 4.4.4-14ubuntu5)
But I believe the compiler version shouldn't be the reason for non-uniform behavior. If ^ doesn't compile, then | and & shouldn't compile either; otherwise all three should work.
It's not an error with this compiler in C99 mode: gcc x.c -Wall -std=c99.
You are right that it should not compile, and on most compilers it does not compile. (Please specify exactly which compiler/version is NOT giving you a compiler error.)
I can only hypothesize that the compiler knows the identities (i | i) == i and (i & i) == i, and is using those identities to optimize away the expression, leaving behind just the variable i.

This is just a guess, but it makes a lot of sense to me.