Why are certain code patterns turned into an intrinsic function when they appear inside JVM internal classes, whereas the same patterns, when called from my own class, are not?
Example:
The bitCount function, when called as Integer.bitCount(i), is turned into an intrinsic. But when the same code is copied into my own class and called from there, it takes much longer to execute.
Compare Integer.bitCount(i) with MyClass.bitCount(i), where MyClass.bitCount is a verbatim copy:

    public static int bitCount(int i) {
        // HD, Figure 5-2
        i = i - ((i >>> 1) & 0x55555555);
        i = (i & 0x33333333) + ((i >>> 2) & 0x33333333);
        i = (i + (i >>> 4)) & 0x0f0f0f0f;
        i = i + (i >>> 8);
        i = i + (i >>> 16);
        return i & 0x3f;
    }
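The gap is easy to reproduce with a rough timing loop. The sketch below is only an illustration (the class name BitCountBench and the iteration count are arbitrary choices of mine, not from the question); a hand-rolled loop like this is sensitive to warm-up, on-stack replacement and dead-code elimination, so a harness such as JMH gives far more trustworthy numbers.

    public class BitCountBench {

        // Verbatim copy of the Java source of Integer.bitCount
        static int bitCount(int i) {
            // HD, Figure 5-2
            i = i - ((i >>> 1) & 0x55555555);
            i = (i & 0x33333333) + ((i >>> 2) & 0x33333333);
            i = (i + (i >>> 4)) & 0x0f0f0f0f;
            i = i + (i >>> 8);
            i = i + (i >>> 16);
            return i & 0x3f;
        }

        public static void main(String[] args) {
            int sink = 0;  // keep results live so the JIT cannot eliminate the loops
            long t0 = System.nanoTime();
            for (int i = 0; i < 200_000_000; i++) sink += Integer.bitCount(i); // intrinsified (POPCNT)
            long t1 = System.nanoTime();
            for (int i = 0; i < 200_000_000; i++) sink += bitCount(i);         // plain Java copy
            long t2 = System.nanoTime();
            System.out.printf("Integer.bitCount: %d ms, local copy: %d ms (sink=%d)%n",
                    (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000, sink);
        }
    }

On a CPU with POPCNT the first loop typically comes out noticeably faster, which is the effect described above.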
The answer is simple: a function is defined as an intrinsic because a faster, native way to obtain its result exists, and the JVM applies it where possible thanks to an explicit, predefined mapping.
This is not something the compiler works out from the code itself. Integer.bitCount is special in the sense that its implementation is marked as replaceable with the native POPCNT instruction. That single instruction is used whenever Integer.bitCount is called (provided the CPU supports it); when you declare your own copy of the function, the ordinary Java implementation is used.
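In recent JDK sources this "marking" is even visible at the Java level: Integer.bitCount carries the jdk.internal.vm.annotation.IntrinsicCandidate annotation (called @HotSpotIntrinsicCandidate in JDK 9-15; JDK 8 has no annotation at all). A sketched excerpt, with the body elided because it is the same code quoted in the question; note that the annotation is only a marker used for consistency checks, while the actual mapping to POPCNT lives in HotSpot's own intrinsic tables:

    @IntrinsicCandidate
    public static int bitCount(int i) {
        // ... same HD, Figure 5-2 body as quoted in the question ...
    }

Adding such an annotation to MyClass.bitCount would change nothing, because HotSpot looks up intrinsics by the exact owning class, method name and signature.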
Why is the JVM able to recognize that the function can be optimized? Because it is hardcoded in the JDK: HotSpot keeps a fixed list of intrinsics identified by class, method name and signature. It has nothing to do with any similarity of the code.
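You can see this mapping in action from the command line. The flags below exist in HotSpot, though the output format differs between JDK versions, so treat this as a sketch (BitCountBench is the illustrative class from above):

    # print JIT inlining decisions; calls to java.lang.Integer::bitCount are reported as intrinsic,
    # while the local copy shows up as ordinary inlined bytecode
    java -XX:+UnlockDiagnosticVMOptions -XX:+PrintInlining BitCountBench

    # switch off the POPCNT-based intrinsic (x86); with it disabled, both variants
    # are compiled from the same Java code and should perform similarly
    java -XX:-UsePopCountInstruction BitCountBench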