I'm reading the OCP Java SE 7 certification guide by Mala Gupta. On page 297, the following code snippet
import java.util.HashMap;
import java.util.Map;
public class TestGenericTypeInference {
    Map<String, Double> salaryMap = new HashMap<>();
    Map<String, Object> copySalaryMap = new HashMap<>(salaryMap);
}
is compiling with java 8 but with java 7 the compiler complains:
TestGenericTypeInference.java:8: error: incompatible types: HashMap<String,Double> cannot be converted to Map<String,Object>
Map<String,Object> copySalaryMap = new HashMap<>(salaryMap);
^
My question is: what change in the type inference algorithm causes this behavior?
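For reference, the snippet can be made to compile on both Java 7 and Java 8 by spelling out the type arguments instead of using the diamond. This is a minimal sketch (the class name ExplicitTypeArgument and the copy method are mine, not from the book); it works because HashMap's copy constructor takes a Map<? extends K, ? extends V>, and Map<String, Double> matches when K = String and V = Object:

```java
import java.util.HashMap;
import java.util.Map;

public class ExplicitTypeArgument {
    // Explicit type arguments avoid diamond inference entirely, so this
    // compiles under Java 7's weaker inference as well as under Java 8.
    static Map<String, Object> copy(Map<String, Double> salaryMap) {
        return new HashMap<String, Object>(salaryMap);
    }

    public static void main(String[] args) {
        Map<String, Double> salaryMap = new HashMap<>();
        salaryMap.put("alice", 100000.0);
        Map<String, Object> copySalaryMap = copy(salaryMap);
        System.out.println(copySalaryMap.get("alice")); // prints 100000.0
    }
}
```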
Type inference is the ability to automatically deduce, either partially or fully, the type of an expression at compile time. The compiler is often able to infer the type of a variable or the type signature of a function, without explicit type annotations having been given.
The answer to my question:
What change in type inference algorithm causes this behavior?
is answered in the Java Generics FAQ by Angelika Langer, where a similar example is given:
// error in Java 7 ; fine since Java 8
Set<Number> s3 = new HashSet<>(Arrays.asList(0L,0L));
- The [...] expression demonstrates that the left-hand side of the assignment is indeed ignored (in Java 7). The compiler again infers from the constructor's argument, i.e., the result of the asList method, that the missing type parameter for the new HashSet must be Long. This leads to a type mismatch and a corresponding error message. The compiler does not conclude that the missing type parameter should be Number, because it ignores the left-hand side of the assignment. In Java 8, type inference was modified and improved. Since then, the compiler infers Number as the type parameter for the new HashSet<> from the right-hand side of the assignment, and from that deduces Number as the type parameter for the asList method. In Java 8, this compiles just fine.
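Langer's example can be made to compile on Java 7 as well by naming the set's type argument explicitly, so that no diamond inference is needed on the right-hand side (a sketch; the class name DiamondInference is mine):

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.Set;

public class DiamondInference {
    public static void main(String[] args) {
        // Java 8: the target type Set<Number> flows into the diamond, so
        // Number is inferred for both HashSet<> and asList.
        // Java 7: the left-hand side is ignored, Long is inferred, and the
        // assignment is rejected. Writing the type argument explicitly
        // compiles on both versions:
        Set<Number> s3 = new HashSet<Number>(Arrays.asList(0L, 0L));
        System.out.println(s3.size()); // prints 1: the two equal Longs collapse
    }
}
```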
I think it is described in JLS 8, ch 18.2.1:
By treating nested generic method invocations as poly expressions, we improve the behavior of inference for nested invocations. For example, the following is illegal in Java SE 7 but legal in Java SE 8:
ProcessBuilder b = new ProcessBuilder(Collections.emptyList()); // ProcessBuilder's constructor expects a List<String>
When both the outer and the nested invocation require inference, the problem is more difficult. For example:
List<String> ls = new ArrayList<>(Collections.emptyList());
Our approach is to "lift" the bounds inferred for the nested invocation (simply { α <: Object } in the case of emptyList) into the outer inference process (in this case, trying to infer β where the constructor is for type ArrayList<β>). We also infer dependencies between the nested inference variables and the outer inference variables (the constraint ‹List<α> → Collection<β>› would reduce to the dependency α = β). In this way, resolution of the inference variables in the nested invocation can wait until additional information can be inferred from the outer invocation (based on the assignment target, β = String).
This example, List<String> ls = new ArrayList<>(Collections.emptyList());, also does not compile in Java 7.
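On Java 7, the usual workaround for this case is an explicit type witness on the nested generic method call, so the compiler does not have to propagate information between the two invocations (a sketch; the class name NestedInference is mine):

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class NestedInference {
    public static void main(String[] args) {
        // Java 8 lifts the bound from the nested emptyList() call into the
        // outer constructor's inference and uses the assignment target.
        // On Java 7, an explicit type witness on emptyList makes the same
        // statement compile:
        List<String> ls = new ArrayList<String>(Collections.<String>emptyList());
        ls.add("inferred");
        System.out.println(ls); // prints [inferred]
    }
}
```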