Does casting actually DO anything?

Tags:

c

casting

Consider the following snippet:

char x[100];
double *p = &x;

As expected, this yields the following warning:

f.c:3:15: warning: initialization of ‘double *’ from incompatible pointer type ‘char (*)[100]’ 
[-Wincompatible-pointer-types]
    3 |   double *p = &x;
      |               ^

This is very easy to solve by just changing to

double *p = (double*)&x;

My question here is, does the casting actually DO anything? Would the code be invalid without the cast? Or is it just a way to make the compiler quiet? When is casting necessary?

I know that you can have some effect with snippets like this:

int x = 666;
int y = (char)x;

But isn't this the same as this?

int x = 666;
char c = x;
int y = c;

If it is the same, then the casting does something, but it's not necessary. Right?

Please help me understand this.

asked Feb 01 '21 by klutt


4 Answers

The cast does at least one thing: it satisfies the following constraint on assignment:

6.5.16.1 Simple assignment

Constraints

1    One of the following shall hold:112)
...
— the left operand has atomic, qualified, or unqualified pointer type, and (considering the type the left operand would have after lvalue conversion) both operands are pointers to qualified or unqualified versions of compatible types, and the type pointed to by the left has all the qualifiers of the type pointed to by the right;
112) The asymmetric appearance of these constraints with respect to type qualifiers is due to the conversion (specified in 6.3.2.1) that changes lvalues to ‘‘the value of the expression’’ and thus removes any type qualifiers that were applied to the type category of the expression (for example, it removes const but not volatile from the type int volatile * const).

That's a compile-time constraint - it affects whether or not the source code is translated to an executable, but it doesn't necessarily affect the translated machine code.

It may result in an actual conversion being performed at runtime, but that depends on the types involved in the expression and the host system.
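
For instance, here is a minimal sketch (mine, not from the answer) contrasting the two situations: the numeric cast produces a genuinely new value at run time, while the pointer cast typically compiles to nothing at all. The exact machine code is of course compiler- and platform-dependent:

#include <stdio.h>

int main(void)
{
    int i = 7;
    double d = (double)i;                     /* real run-time conversion: 7 becomes 7.0 */

    int *ip = &i;
    unsigned char *cp = (unsigned char *)ip;  /* no run-time work: same address, new type */

    printf("%f\n", d);
    printf("%p %p\n", (void *)ip, (void *)cp); /* typically prints the same address twice */
    return 0;
}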

answered Oct 16 '22 by John Bode

Casting can do several different things. As other answers have mentioned, it almost always changes the type of the value being cast (or, perhaps, an attribute of the type, such as const). It may also change the numeric value in some way. But there are many possible interpretations:

  • Sometimes it merely silences a warning, performing no "real" conversion at all (as in many pointer casts).
  • Sometimes it silences a warning, leaving only a type change but no value change (as in other pointer casts).
  • Sometimes the type change, although it involves no obvious value change, implies very different semantics for use of the value later (again, as in many pointer casts).
  • Sometimes it requests a conversion which is meaningless or impossible.
  • Sometimes it performs a conversion that the compiler would have performed by itself (with or without a warning).
  • But sometimes it forces a conversion that the compiler wouldn't have performed.

Also, the warnings that a cast silences are sometimes innocuous or merely a nuisance, but sometimes they're quite real, and the code is likely to fail in exactly the way the silenced warning was trying to tell you.

For some more specific examples:

A pointer cast that changes the type, but not the value:

char *p1 = ... ;
const char *p2 = (const char *)p1;

And another:

unsigned char *p3 = (unsigned char *)p1;

A pointer cast that changes the type in a more significant way, but that's guaranteed to be okay (on some architectures this might also change the value):

int i;
int *ip = &i;
char *p = (char *)ip;
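
For instance, this kind of cast is what lets you inspect an object's representation byte by byte. A small sketch of mine (using unsigned char so the bytes print cleanly; the output depends on your int size and endianness):

#include <stdio.h>

int main(void)
{
    int i = 0x01020304;
    unsigned char *p = (unsigned char *)&i;  /* viewing any object as bytes is always allowed */

    for (size_t k = 0; k < sizeof i; k++)
        printf("%02x ", p[k]);               /* the byte order reveals the machine's endianness */
    printf("\n");
    return 0;
}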

A similarly significant pointer cast, but one that's quite likely to be not okay:

char c;
char *cp = &c;
int *ip = (int *)cp;
*ip = 5;                  /* likely to fail */

A pointer cast that's so meaningless that the compiler refuses to perform it, even with an explicit cast:

float f = 3.14;
char *p = (char *)f;      /* guaranteed to fail: a float cannot be cast to a pointer */

A pointer cast that makes a conversion, but one that the compiler would have made anyway:

int *p = (int *)malloc(sizeof(int));

(This one is considered a bad idea, because in the case where you forget to include <stdlib.h> to declare malloc(), the cast can silence a warning that might alert you to the problem.)
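
In modern C the usual recommendation is therefore to drop the cast entirely, let the implicit conversion from void * do the work, and size the allocation off the pointer itself:

int *p = malloc(sizeof *p);   /* void * converts to int * implicitly; no cast needed */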

Three casts from an integer to a pointer that are actually well-defined, thanks to a very specific special case in the C language (an integer constant expression with value 0 is a null pointer constant):

void *p1 = (void *)0;
char *p2 = (void *)0;
int *p3 = (int *)0;

Two casts from integer to pointer that are not necessarily valid, although the compiler will generally do something obvious, and the cast will silence the warning that would otherwise be issued:

int i = 123;
char *p1 = (char *)i;
char *p2 = (char *)124;
*p1 = 5;                  /* very likely to fail, except when */
*p2 = 7;                  /* doing embedded or OS programming */

A very questionable cast from a pointer back to an int:

char *p = ... ;
int i = (int)p;

A less-questionable cast from a pointer back to an integer that ought to be big enough:

char *p = ... ;
uintptr_t i = (uintptr_t)p;
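
If <stdint.h> provides uintptr_t (it is an optional type), the standard guarantees the round trip for void * pointers: converting to uintptr_t and back yields a pointer that compares equal to the original. A sketch, assuming the header is available:

#include <stdint.h>

char c;
void *vp = &c;
uintptr_t n = (uintptr_t)vp;  /* well-defined if uintptr_t exists */
void *vq = (void *)n;         /* guaranteed: vq == vp */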

A cast that changes the type, but "throws away" rather than "converting" a value, and that silences a warning:

(void)5;
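
The same idiom is commonly used to state that a value is being ignored on purpose, for example a return value you deliberately discard or an unused parameter:

(void)printf("hello\n");   /* "yes, I know printf returns a value; I'm ignoring it" */

void handler(int sig)
{
    (void)sig;             /* silences an unused-parameter warning */
}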

A cast that makes a numeric conversion, but one that the compiler would have made anyway:

float f = (float)0;

A cast that changes the type and the interpreted value, although it typically won't change the bit pattern:

short int si = -32760;
unsigned short us = (unsigned short)si;
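
For instance, on a typical system with a 16-bit, two's-complement short (and <stdio.h> included), the bit pattern 0x8008 stays put, but the value read from it changes:

printf("%hd %hu\n", si, us);   /* typically prints: -32760 32776 */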

A cast that makes a numeric conversion, but one that the compiler probably would have warned about:

int i = (int)1.5;

A cast that makes a conversion that the compiler would not have made:

double third = (double)1 / 3;
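
Without the cast, the division is carried out in int arithmetic first, so the fractional part is gone before the value ever reaches the double:

double a = 1 / 3;             /* integer division: a is 0.0 */
double b = (double)1 / 3;     /* 1.0 / 3 in double: b is 0.333... */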

The bottom line is that casts definitely do things: some of them useful, some of them unnecessary but innocuous, some of them dangerous.

These days, the consensus among many C programmers is that most casts are, or should be, unnecessary. That makes it a decent rule to avoid explicit casts unless you're sure you know what you're doing, and it's reasonable to be suspicious of explicit casts in someone else's code, since they're often a sign of trouble.


As one final example, this was the case that, back in the day, really made the light bulb go on for me with respect to pointer casts:

char *loc;
int val;
int size;

/* ... */

switch(size) {
    case 1: *loc += val; break;
    case 2: *(int16_t *)loc += val; break;
    case 4: *(int32_t *)loc += val; break;
}

Those three instances of *loc += val do three pretty different things: one updates a byte, one updates a 16-bit int, and one updates a 32-bit int. (The code in question was a dynamic linker, performing symbol relocation.)
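
Here is a self-contained sketch of mine showing the same effect on a scratch buffer (the buffer, the values, and the little-endian output in the comment are my own assumptions, not the original linker code):

#include <stdio.h>
#include <stdint.h>
#include <stdlib.h>

int main(void)
{
    /* malloc'd storage has no declared type and is suitably aligned for the casts below */
    unsigned char *buf = calloc(8, 1);
    if (!buf) return 1;

    char *loc = (char *)buf;
    int val = 1;

    *loc += val;                    /* updates 1 byte               */
    *(int16_t *)(loc + 2) += val;   /* updates 2 bytes as an int16  */
    *(int32_t *)(loc + 4) += val;   /* updates 4 bytes as an int32  */

    for (int i = 0; i < 8; i++)
        printf("%02x ", buf[i]);    /* e.g. "01 00 01 00 01 00 00 00" on a little-endian machine */
    printf("\n");

    free(buf);
    return 0;
}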

answered Oct 16 '22 by Steve Summit


Casting changes the type, which can be very important when the signedness of the type matters.

For example, the character-handling functions such as isupper() and isdigit() are defined as taking an int whose value is representable as an unsigned char or equal to the value of the macro EOF:

The header <ctype.h> declares several functions useful for classifying and mapping characters. In all cases the argument is an int, the value of which shall be representable as an unsigned char or shall equal the value of the macro EOF. If the argument has any other value, the behavior is undefined.

Thus code such as

int isNumber( const char *input )
{
    while ( *input )
    {
        if ( !isdigit( *input ) )
        {
             return( 0 );
        }
        input++;
    }
    // all digits
    return( 1 );
}

should properly cast the const char value of *input to unsigned char:

int isNumber( const char *input )
{
    while ( *input )
    {
        if ( !isdigit( ( unsigned char ) *input ) )
        {
             return( 0 );
        }
        input++;
    }
    // all digits
    return( 1 );
}

Without the cast to unsigned char, when *input is promoted to int, a char value that is negative (assuming char is signed and narrower than int) is sign-extended to a negative int that (unless it happens to equal EOF) cannot be represented as an unsigned char, and therefore invokes undefined behavior.

So yes, the cast in this case does something. It changes the type and therefore - on almost all current systems - avoids undefined behavior for input char values that are negative.
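
A concrete illustration of the hazard (my example: a hypothetical string containing the byte 0xE9, as in Latin-1 or UTF-8 text, on a system with a signed 8-bit char and <ctype.h> included):

const char *s = "caf\xE9";      /* s[3] holds the byte 0xE9, typically -23 as a signed char */
isdigit(s[3]);                  /* argument promotes to -23: undefined behavior             */
isdigit((unsigned char)s[3]);   /* argument promotes to 233: well-defined, returns 0        */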

There are also cases where float values can be cast to double (or the reverse) to force code to behave in a desired manner.*

* - I've seen such cases recently - if someone can find an example, feel free to add your own answer...

answered Oct 16 '22 by Andrew Henle


The cast may or may not change the actual binary value. But that is not its main purpose, just a side effect.

It tells the compiler to interpret a value as a value of a different type. Any changing of binary value is a side effect of that.

It is for you (the programmer) to let the compiler know: I know what I'm doing. So you can shoot yourself in the foot without the compiler questioning you.

Don't get me wrong, casts are absolutely necessary in real-world code, but they must be used with care and knowledge. Never cast just to get rid of a warning; make sure you understand the consequences.
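
As a small illustration (mine; it assumes <stdio.h> is included, a 32-bit int, and two's complement): the cast below changes how the very same bit pattern is interpreted, and typically requires no instruction at all:

int i = -1;
unsigned u = (unsigned)i;      /* same bits on a two's-complement machine...              */
printf("%d %u\n", i, u);       /* ...but interpreted as -1 and UINT_MAX (4294967295 here) */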

answered Oct 16 '22 by koder