 

Why does casting go from big to small with primitive types, but from small to big with objects?

In Java we need a cast when converting a double (big in memory size) to an int (smaller in memory size):

int x = (int) 4.3;

But in the case of objects, if the parent class is Mammal (small in memory size) and its subclass is Human (big in memory size, since it has more properties than Mammal), then

Mammal m = new Human(); // works without a cast

but the small-to-big conversion

Human h = (Human) m; // needs a cast
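
For reference, here is a minimal sketch of the two situations together (the Human field below is made up just to show it has more members than Mammal):

class Mammal {
    int legs;                    // something every mammal has
}

class Human extends Mammal {
    String name;                 // extra property, so a Human object is "bigger" than a Mammal
}

class CastDemo {
    public static void main(String[] args) {
        int x = (int) 4.3;       // big to small primitive: cast needed
        Mammal m = new Human();  // works without a cast
        Human h = (Human) m;     // needs a cast
        System.out.println(x);   // prints 4
    }
}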

Thanks in advance.

asked Aug 12 '11 by Mr Coder

2 Answers

Casting is not about the size of the object: it's about the range of the variable.

By 'range', I mean the variety of different values that the variable can contain. If you assign from one variable to another whose range is a superset of the first, you don't need to cast, because you know that the assignment will be okay. But when you assign from one variable to another whose range is a subset, you do need to cast, because the assignment might not be possible.

Imagine you have two containers: a plastic tub and a wire shopping basket, of the same size. Clearly, anything you can keep in the wire basket, you can keep in the tub. But not everything you can keep in the tub can be kept in the basket. A pile of apples, you can. But a pile of raisins, you can't - they would fall through the holes in the basket. So, the range of things that the tub can hold is greater than the range of things the basket can hold, even though both are the same size.

In that analogy, casting is like checking whether the thing you're moving will fit in the new container. You don't need to check when moving things from the basket to the tub, but you do need to check when moving from the tub to the basket, otherwise you will end up with fruit all over the floor.

In your specific cases, we know that every human is a mammal, but that not every mammal is a human, so the range of a variable of type Mammal is greater than that of a variable of type Human. We also know that the range of a double (approximately -(2^1024) to 2^1024) is greater than that of an int (-2^31 to 2^31 - 1). So, assigning from the former to the latter in either case requires a cast, but from the latter to the former does not.
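
To see that in action, here is a small sketch (the Dog class is made up for illustration): the range of a Mammal variable includes things that are not Humans, so the downcast can fail at runtime, just as the narrowing primitive cast can lose information.

class Mammal {}
class Human extends Mammal {}
class Dog extends Mammal {}

class RangeDemo {
    public static void main(String[] args) {
        double d = 4.3;
        int i = (int) d;             // narrowing: the fractional part is discarded
        System.out.println(i);       // prints 4

        Mammal m = new Dog();        // the Mammal variable's range includes Dog
        Human h = (Human) m;         // compiles, but throws ClassCastException at runtime
    }
}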

answered Sep 21 '22 by Tom Anderson

When you use primitive types, you have to cast explicitly whenever there is a chance you might lose information. For example, long is 64 bits and int is 32. Converting a long to an int can result in data loss (32 bits in this case).
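
For example, a quick sketch of that 64-to-32-bit loss (the value is just chosen so it does not fit in an int):

class NarrowingDemo {
    public static void main(String[] args) {
        long big = 4_000_000_000L;   // fits in a long, but not in a 32-bit int
        int small = (int) big;       // explicit cast required; the top 32 bits are dropped
        System.out.println(small);   // prints -294967296, not 4000000000
    }
}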

When dealing with objects, this relates to polymorphism. The compiler is able to ensure that every Human is a Mammal, so there is no problem there. But it is unable to ensure that every Mammal is a Human, so you have to cast explicitly to convert the reference type.

You can see explicit casts as a way of saying to the compiler "I know you can't ensure this data conversion is safe, but I know what I am doing".
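
And if you are not sure the conversion really is safe, you can check first; a minimal sketch, reusing the Mammal/Human names from the question:

class Mammal {}
class Human extends Mammal {}

class SafeCastDemo {
    static void describe(Mammal m) {
        if (m instanceof Human) {        // check before downcasting
            Human h = (Human) m;         // safe: m really is a Human here
            System.out.println("A human: " + h);
        } else {
            System.out.println("Some other mammal: " + m);
        }
    }

    public static void main(String[] args) {
        describe(new Human());           // takes the Human branch
        describe(new Mammal());          // takes the other branch
    }
}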

answered Sep 20 '22 by Vivien Barousse