Do C and C++ compilers generally optimize comparisons with functions?
For example, this page suggests that the size() function on std::list in C++ can have linear complexity, O(N), in some standard library implementations (which makes sense for a linked list).
But in that case, if myList
is a huge list, what would something like this do?
if (myList.size() < 5) return 1;
else return 2;
Would the size() function find and count all N list members, or would it be optimized to short circuit after finding 5 members?
Theoretically the possibility exists if size()
were inlined, but to perform the optimization the compiler would have to inline the counting loop, prove it has no other side effects, notice that its result is only compared against the constant 5, and then rewrite the loop to stop after five steps.
That's a big bunch of things to count on IMHO, and it includes features which are not "obviously useful" in other contexts as well. Keep in mind that compiler vendors have limited resources so there has to be really good justification for implementing these prerequisites and having the compiler bring all the parts together to optimize this case.
Seeing as, even if this is a performance issue for someone, the problem can easily be solved in code, I don't feel that there would be such justification. So no, generally you should not expect cases like this to be optimized.
Actually, in C++11, std::list::size()
is required to run in constant time: the list keeps its element count up to date.
In C++03, size()
was allowed to take linear time, and some implementations did count the elements on every call.
Would the size() function find and count all N list members, or would it be optimized to short circuit after finding 5 members?
Never seen this sort of optimization happening in practice. While it's certainly legal, I doubt there's any compiler that actually implements something like this.