Would it be possible for a JIT compiler to utilize GPU for certain operations behind the scenes?

Feel free to correct me if any part of my understanding is wrong.

My understanding is that GPUs offer a subset of the instructions that a normal CPU provides, but execute them much faster.

I know there are ways to utilize GPU cycles for non-graphical purpose, but it seems like (in theory) a language that's Just In Time compiled could detect the presence of a suitable GPU and offload some of the work to the GPU behind the scenes without code change.

Is my understanding naive? Or is it just a matter of it being really complicated, so it simply hasn't been done yet?

asked Jun 30 '10 by Davy8

1 Answer

My understanding is that GPUs offer a subset of the instructions that a normal CPU provides, but execute them much faster.

It's definitely not that simple. The GPU is tailored mainly to SIMD/vector processing. So even though the theoretical peak throughput of modern GPUs is vastly superior to that of CPUs, only programs that can benefit from SIMD-style execution can run efficiently on the GPU. There is also, of course, a performance penalty when data has to be transferred from the CPU to the GPU to be processed there.
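To make the SIMD point concrete, here is an illustrative sketch (hypothetical examples, not tied to any real JIT): the first loop is data-parallel, so each iteration could map onto a SIMD lane or GPU thread; the second has a loop-carried dependency, so a naive translation cannot run its iterations in parallel.

```python
def saxpy(a, xs, ys):
    # Every iteration is independent of the others: out[i] = a*x[i] + y[i].
    # This is exactly the shape of loop a GPU handles well.
    return [a * x + y for x, y in zip(xs, ys)]

def prefix_sum_sequential(xs):
    # Loop-carried dependency: each result needs the previous one,
    # so these iterations cannot simply be distributed across GPU threads.
    out, acc = [], 0
    for x in xs:
        acc += x
        out.append(acc)
    return out

print(saxpy(2.0, [1.0, 2.0], [10.0, 20.0]))  # [12.0, 24.0]
print(prefix_sum_sequential([1, 2, 3]))      # [1, 3, 6]
```

(Parallel prefix-sum algorithms do exist, but they require restructuring the computation; a JIT cannot get them "for free" from the sequential loop.)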

So for a JIT compiler to use the GPU efficiently, it must be able to detect code that can be parallelized to benefit from SIMD-style execution, and then determine whether the overhead of transferring data between the CPU and the GPU will be outweighed by the performance improvement.
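The "is it worth it?" decision can be sketched as a toy cost model. All the constants below are invented round numbers for illustration, not measurements of any real hardware: offloading pays off only when the GPU's compute advantage outweighs the round-trip transfer over the bus.

```python
def should_offload(n_elements, flops_per_element,
                   cpu_flops=50e9,            # assumed CPU throughput
                   gpu_flops=1e12,            # assumed GPU throughput
                   bus_bytes_per_sec=16e9,    # assumed CPU<->GPU bandwidth
                   bytes_per_element=8):
    """Return True if a hypothetical JIT's cost model favors the GPU."""
    work = n_elements * flops_per_element
    t_cpu = work / cpu_flops
    # Data must travel to the GPU and the results must come back.
    t_transfer = 2 * n_elements * bytes_per_element / bus_bytes_per_sec
    t_gpu = work / gpu_flops + t_transfer
    return t_gpu < t_cpu

# A small, cheap loop: the transfer dominates, so stay on the CPU.
print(should_offload(1_000, 1))          # False
# A large, compute-heavy kernel: the GPU wins despite the transfer.
print(should_offload(100_000_000, 100))  # True
```

Note how the model captures the answer's point: for small inputs or cheap per-element work, the fixed transfer cost makes the GPU a net loss even though its raw throughput is higher.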

answered Oct 24 '22 by Janick Bernet