Although there are plenty of links about this subject on SO, I think something is missing: a clear, plain-language explanation of the differences between unspecified behavior (UsB), undefined behaviour (UB) and implementation-defined behavior (IDB), with detailed but accessible explanations and examples for each use case.
Note: I made the UsB acronym up for sake of compactness in this WIKI, but don't expect to see it used elsewhere.
I know this may seem a duplicate of other posts (the closest being this one), but before anyone marks it as a duplicate, please consider the problems with all the material I have already found (and I'm going to make a community WIKI out of this post):
Too many scattered examples. Examples are not bad, of course, but sometimes one cannot find an example that nicely fits the problem at hand, so they may be confusing (especially for newbies).
Examples are often only code with few explanations. On such delicate matters, especially for (relative) newbies, a more top-down approach could be better: first a clear, simple explanation with an abstract (but not legalistic) description, then some simple examples with explanations on why they trigger some behavior.
Some posts often sport a mix of C and C++ examples. C and C++ sometimes disagree about what they deem UsB, UB and IDB, so an example can be misleading for someone not proficient in both languages.
When a definition of UsB, UB, and IDB is given, it is usually a plain citation of the standards, which may be unclear or too difficult for newbies to digest.
Sometimes the citations of the standards are partial. Many posts cite the standard only for the parts relevant to the problem at hand, which is good but lacks generality. Moreover, citations of the standards are often not accompanied by any explanation (bad for beginners).
Since I am not a super-expert on this subject myself, I will make a community WIKI so that anyone interested can contribute and improve the answer.
In order not to spoil my purpose of creating a structured, beginner-friendly WIKI, I'd like posters to follow a couple of simple guidelines when editing it:
Categorize your use case. Try to put your example/code under an already existing category, if applicable, otherwise create a new one.
First the plain-words description. First describe with simple words (without oversimplifying, of course - quality first!) the example or the point you are trying to make. Then put code samples or citations.
Cite the standards by reference. Don't post snippets of various standards, but give clear references (e.g. C99 WG14/N... section 1.4.7, paragraph ...) and post a link to the relevant resource, if possible.
Prefer free online resources. If you want to cite books or non-freely available resources that's OK (and may improve the quality of the WIKI), but also try to add some links to free resources. This is especially important for ISO standards. You are welcome to add links to official standards, but try to add an equivalent link to freely available drafts as well. And please don't replace links to drafts with references to official standards; add to them. Even some Computer Science departments in some universities don't have a copy of the ISO standard(s), let alone most programmers at large!
Don't post code unless really necessary. Post code only if an explanation using only plain English would be awkward or unclear. Try to limit code samples to one-liners. Post links to other SO Q&A instead.
Don't post C++ examples. I'd like this to become a sort of FAQ for C (If someone wants to start a twin-thread for C++ that would be great, though). Relevant differences with C++ are welcome, but only as side-notes. That is after you explain the C case thoroughly you may add a couple of statements about C++ if this would help a C programmer when switching to C++, but I wouldn't want to see examples with more than, say, 20% C++ stuff. Usually a simple note like "(C++ behaves differently in this case)" plus a relevant link should be enough.
Since I'm fairly new to SO I hope I'm not breaking any rule by starting a Q&A this way. Sorry if this is the case. The mods are welcome to let me know about it.
Undefined behavior results in unpredictable behavior of the entire program. With unspecified behavior, by contrast, the program makes a choice at a particular junction and then continues as usual, as if the code had executed normally.
So, in C/C++ programming, undefined behavior means that the standard imposes no requirements: the program may fail to compile, it may execute incorrectly (either crashing or silently generating incorrect results), or it may fortuitously do exactly what the programmer intended.
Implementation-defined behavior is defined by the ISO C Standard, section 3.4.1, as: "unspecified behavior where each implementation documents how the choice is made. EXAMPLE An example of implementation-defined behavior is the propagation of the high-order bit when a signed integer is shifted right."
Undefined behavior can lead to security vulnerabilities in software. For example, many buffer overflows and other security vulnerabilities in major web browsers are due to undefined behavior. The Year 2038 problem is another example, arising from signed integer overflow.
C standards define UsB, UB and IDB in a way that can be summarized as follows:
This is a behavior for which the standard gives some alternatives among which the implementation must choose, but it doesn't mandate how and when the choice is to be made. In other words, the implementation must accept user code triggering that behavior without erroring out and must comply with one of the alternatives given by the standard.
Be aware that the implementation is not required to document anything about the choices made. These choices may also be non-deterministic or dependent (in an undocumented way) on compiler options.
To summarize: the standard gives some possibilities among which to choose, the implementation chooses when and how the specific alternative is selected and applied.
Note that the standard may provide a really large number of alternatives. The typical example is the initial value of local variables that are not explicitly initialized. The standard says this value is unspecified as long as it is a valid value for the variable's data type. (Strictly speaking, the standard calls such a value indeterminate, and in some circumstances even reading it is UB; the unspecified-value view is a useful first approximation.)
To be more specific, consider an int variable: an implementation is free to choose any int value, and this choice can be completely random, non-deterministic or at the mercy of the whims of the implementation, which is not required to document anything about it. As long as the implementation stays within the limits stated by the standard this is OK and the user cannot complain.
As the naming indicates this is a situation in which the C standard doesn't impose or guarantee what the program would or should do. All bets are off. Such a situation:
renders a program either erroneous or nonportable
doesn't require absolutely anything from the implementation
This is a really nasty situation: as soon as any piece of code has undefined behavior, the entire program is considered erroneous and the implementation is allowed by the standard to do anything.
In other words, the presence of a cause of UB allows the implementation to completely ignore the standard, as far as the program triggering the UB is concerned.
Note that the actual behavior in this case covers an unlimited range of possibilities; no list could be exhaustive. The program may crash, it may silently produce wrong results, it may appear to work exactly as intended, it may behave differently from one run to the next, it may format your hard drive, or it may make demons fly out of your nose.
I hope the last two (half-serious) items give you the right gut feeling about the nastiness of UB. And even though most implementations will not insert the code needed to format your hard drive, real compilers do optimize!
Terminology Note: Sometimes people argue that some piece of code which the standard deems a source of UB works in a documented way on their implementation/system/environment, and therefore cannot really be UB. This reasoning is wrong, but it is a common (and somewhat understandable) misunderstanding: when the term UB (and also UsB and IDB) is used in a C context it is a technical term whose precise meaning is defined by the standard(s). In particular, the word "undefined" loses its everyday meaning.
Therefore it doesn't make sense to show examples where erroneous or nonportable programs produce "well-defined" behavior as counterexamples. If you try, you really miss the point. UB means that you lose all the guarantees of the standard. If your implementation provides an extension, then your guarantees are only those of your implementation. If you use that extension, your program is no longer a conforming C program (in a sense, it is no longer a C program at all, since it no longer follows the standard!).
A common question about UB goes along these lines: "If UB is so nasty, why doesn't the standard mandate that an implementation issue an error when faced with UB?"
First, optimizations. Allowing implementations not to check for possible causes of UB allows lots of optimizations that make a C program extremely efficient. This is one of the features of C, although it makes C a source of many pitfalls for beginners.
Second, the existence of UB in the standards allows a conforming implementation to provide extensions to C without being deemed non-conforming as a whole.
As long as an implementation behaves as mandated for a conforming program, it is itself conforming, although it may provide non-standard facilities that may be useful on specific platforms. Of course the programs using those facilities will be nonportable and will rely on documented UB, i.e. behavior that is UB according to the standard, but that an implementation documents as an extension.
This is a behavior that can be described in a way similar to UsB: the standard provides some alternatives and the implementation chooses one, but here the implementation is required to document exactly how the choice is made.
This means that a user reading her compiler's documentation must be given enough information to predict exactly what will happen in the specific case.
Note that an implementation that doesn't fully document an IDB cannot be deemed conforming. A conforming implementation must document exactly what happens in any case that the standard declares IDB.
The order of evaluation of function arguments is unspecified (see CERT EXP30-C). For instance, in c(a(), b()); it is unspecified whether the function a is called before or after b. The only guarantee is that both are called before c itself.
Null pointers are used to signal that a pointer does not point to valid memory. As such, it does not make much sense to try to read or write to memory via a null pointer.
Technically, this is undefined behaviour. However, since this is a very common source of bugs, most C environments ensure that most attempts to dereference a null pointer will immediately crash the program (usually killing it with a segmentation fault). This safety net is not perfect: because of the pointer arithmetic involved in accessing arrays and structure members, the faulting access may land far away from address zero, so even with modern tools, dereferencing a null pointer may format your hard drive.
Just like with null pointers, dereferencing a pointer before explicitly setting its value is UB. Unlike for null pointers, most environments do not provide any safety net against this sort of error, except that the compiler can warn about it. If you compile such code anyway, you are likely to experience the whole nastiness of UB.
An invalid pointer is a pointer that contains an address that is not within any allocated memory area. Common ways to create invalid pointers are calling free() (after the call, the pointer is invalid, which is pretty much the point of calling free()) and using pointer arithmetic to obtain an address beyond the limits of an allocated memory block.
This is the most evil variant of pointer-dereferencing UB: there is no safety net and no compiler warning; the code may simply do anything. And commonly it does: most malware attacks exploit this kind of UB to make a program behave as the attacker wants (installing a trojan, logging keystrokes, encrypting your hard drive, etc.). The possibility of a formatted hard drive becomes very real with this kind of UB!
If we declare an object as const, we give the compiler a promise that we will never change the value of that object. In many contexts a compiler will spot such an invalid modification and shout at us. But if we cast the constness away, as in this snippet:
int const a = 42;
...
int* ap0 = &a; //< error, compiler will tell us
int* ap1 = (int*)&a; //< silences the compiler
...
*ap1 = 43; //< UB ==> program crash?
the compiler might not be able to track the invalid access: the code compiles to an executable, and only at run time will the invalid write be detected (if at all), typically crashing the program, for example when a has been placed in read-only memory.