While watching the Towards a Universal VM presentation, I studied a slide that lists all the optimisations the HotSpot JIT does.
In the language-specific techniques section there is "de-reflection". I tried to find information about it across the Internet, but failed. I understand that this optimization somehow eliminates reflection costs, but I'm interested in the details. Can someone clarify this, or give some useful links?
JIT stands for "just-in-time" compilation, i.e. compiling bytecode to machine code on the fly. HotSpot is the JVM's approach of compiling only the code that is actually being used heavily.
The JIT compilers in OpenJDK take .class files (the bytecode produced by javac) as input rather than Java source code; in this way a JIT compiler differs from a compiler like GCC, which directly consumes the source code you write.
The HotSpot VM defaults to interpreting Java bytecode and JIT-compiles methods that have been executed a predetermined number of times. There are two JIT compilers: the client compiler, which compiles methods quickly but emits machine code that is less optimized, and the server compiler, which compiles more slowly but optimizes more aggressively.
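For a concrete look at this, here is a minimal sketch (the class and method names are mine, not from the presentation): it runs a small method hot enough for HotSpot to compile it, and launching it with the -XX:+PrintCompilation flag prints a line for every method the JIT compiles. The exact thresholds and output format depend on the JDK version and tiered-compilation settings.

```java
public class JitDemo {
    // A tiny method that becomes "hot": after enough interpreted executions,
    // HotSpot's JIT compiles it to machine code.
    static long square(long x) {
        return x * x;
    }

    public static void main(String[] args) {
        long sum = 0;
        for (int i = 0; i < 1_000_000; i++) {
            sum += square(i);
        }
        // Print the result so the loop cannot be optimized away entirely.
        System.out.println(sum);
    }
}
```

Running `java -XX:+PrintCompilation JitDemo` should show `JitDemo::square` (among many JDK methods) being compiled while the program runs.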
Yes, there is an optimization that reduces reflection costs, though it is implemented mostly in the class library rather than in the JVM.
Before Java 1.4, Method.invoke worked through a JNI call into the VM runtime. Each invocation required at least two transitions from Java to native code and back to Java. The VM runtime parsed the method signature, verified that the types of the passed arguments were correct, performed boxing/unboxing, and constructed a new Java frame for the called method. All of that was rather slow.
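To get a rough feel for the per-call overhead described above (boxing/unboxing, argument checks, extra stack frames), here is a minimal, unscientific comparison of a direct call against Method.invoke. The class and method names are made up for the example, and on a modern JDK the reflective path is the optimized one described next, so the gap is far smaller than it was before Java 1.4; treat the numbers as an illustration only, not a benchmark.

```java
import java.lang.reflect.Method;

public class ReflectionCostDemo {
    public static int add(int a, int b) {
        return a + b;
    }

    public static void main(String[] args) throws Exception {
        Method m = ReflectionCostDemo.class.getMethod("add", int.class, int.class);
        final int iterations = 5_000_000;

        long t0 = System.nanoTime();
        long directSum = 0;
        for (int i = 0; i < iterations; i++) {
            directSum += add(i, 1);                          // plain call
        }
        long t1 = System.nanoTime();

        long reflectiveSum = 0;
        for (int i = 0; i < iterations; i++) {
            reflectiveSum += (Integer) m.invoke(null, i, 1); // boxing + accessor dispatch
        }
        long t2 = System.nanoTime();

        System.out.printf("direct: %d ms, reflective: %d ms (sums: %d / %d)%n",
                (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000, directSum, reflectiveSum);
    }
}
```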
Since Java 1.4, Method.invoke uses dynamic bytecode generation once a method has been called more than 15 times (configurable via the sun.reflect.inflationThreshold system property). A special Java class responsible for calling that particular method is generated at run time. This class implements sun.reflect.MethodAccessor, to which java.lang.reflect.Method delegates its calls.
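One way to watch this happen is to let the reflectively invoked method print the class of its direct caller, i.e. the MethodAccessor that Method.invoke delegated to. This is a sketch with made-up class and method names; the package names assume a JDK 8-era VM, where the accessors live in sun.reflect (newer JDKs move them to jdk.internal.reflect, and recent versions reimplement reflection on top of method handles).

```java
import java.lang.reflect.Method;

public class InflationDemo {
    // Invoked via reflection; prints which accessor class dispatched the call.
    public static void target(int call) {
        StackTraceElement caller = new Throwable().getStackTrace()[1];
        System.out.println("call " + call + " dispatched by " + caller.getClassName());
    }

    public static void main(String[] args) throws Exception {
        Method m = InflationDemo.class.getMethod("target", int.class);
        for (int i = 1; i <= 20; i++) {
            // The first ~15 calls go through the native (JNI) accessor;
            // after the threshold, a generated bytecode accessor takes over.
            m.invoke(null, i);
        }
    }
}
```

On JDK 8 the output typically switches from sun.reflect.NativeMethodAccessorImpl to a sun.reflect.GeneratedMethodAccessor* class after the threshold; lowering the threshold, e.g. with -Dsun.reflect.inflationThreshold=5, should move the switch earlier.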
The approach with dynamic bytecode generation is much faster, since it does not require transitions to native code and does not have to parse the method signature on every invocation.
Note that this optimization is implemented mostly in Java code, without JVM assistance. The only thing the HotSpot VM does to make it possible is to skip bytecode verification for these generated MethodAccessors; otherwise the verifier would not allow them, for example, to call private methods.