A common argument in favor of JIT-compiled languages such as C# and Java is that they can be optimized better: the virtual machine can profile the program as it runs and use that information to make optimization decisions that a statically optimizing C++ compiler cannot.
However, I was wondering whether we could also use a virtual machine to optimize C++ (or any similar language) at runtime. For example, we could take the IR generated by the LLVM compiler and build a virtual machine that interprets, JIT-compiles, and optimizes the code, just as happens with Java and C#.
Of course, there would be no garbage collection, but the optimization benefits would still apply. Has anyone worked on this? Are there any papers or tools on this? How well would this approach work?
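To make the question concrete, here is roughly the kind of thing I imagine, sketched with LLVM's ORC LLJIT API (the file name `input.ll` and the function name `compute` are placeholders of mine, and the exact API varies between LLVM versions):

```cpp
#include "llvm/ExecutionEngine/Orc/LLJIT.h"
#include "llvm/ExecutionEngine/Orc/ThreadSafeModule.h"
#include "llvm/IRReader/IRReader.h"
#include "llvm/Support/Error.h"
#include "llvm/Support/SourceMgr.h"
#include "llvm/Support/TargetSelect.h"
#include "llvm/Support/raw_ostream.h"

using namespace llvm;
using namespace llvm::orc;

int main(int argc, char **argv) {
  // The JIT compiles for the machine we are running on.
  InitializeNativeTarget();
  InitializeNativeTargetAsmPrinter();

  // Load IR previously produced by the static compiler,
  // e.g. via: clang -S -emit-llvm foo.cpp -o input.ll
  auto Ctx = std::make_unique<LLVMContext>();
  SMDiagnostic Err;
  std::unique_ptr<Module> M = parseIRFile("input.ll", Err, *Ctx);
  if (!M) {
    Err.print(argv[0], errs());
    return 1;
  }

  // Hand the module to the JIT; code is generated when symbols are looked up.
  auto JIT = cantFail(LLJITBuilder().create());
  cantFail(JIT->addIRModule(ThreadSafeModule(std::move(M), std::move(Ctx))));

  // Materialize and call a function from the JITed module
  // ("compute" is a hypothetical int(int) function in input.ll).
  auto Addr = cantFail(JIT->lookup("compute"));
  auto *Compute = Addr.toPtr<int (*)(int)>();
  outs() << Compute(42) << "\n";
  return 0;
}
```

This only JIT-compiles the IR once; the part I am asking about is the next step, where the runtime would re-optimize hot functions using profile data, the way the JVM and CLR do.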
On top of that, even when you're programming in C, there is a C virtual machine! It is traditionally referred to as "the C runtime", or CRT for short.
"Virtual machine" in this context refers to the computation model of the language. In C, that model is essentially a single-CPU machine with a big memory space that can be accessed with pointers (including a possibly disjoint memory space for program code).
CPU virtualization overhead usually translates into a reduction in overall performance. For applications that are not CPU-bound, that overhead tends to show up as higher CPU utilization rather than lower throughput.
This is a flawed argument. Yes, virtual machines have more information to work with, but they also have vastly less time and memory to spend on optimization than an ahead-of-time compiler does.
Also, yes, you absolutely can do this if you really want to. But nobody does, so in practice it isn't happening; at least not for optimization reasons, though you might do it for sandboxing.