Integer calculations on GPU

For my work it's particularly interesting to do integer calculations, which obviously are not what GPUs were made for. My question is: Do modern GPUs support efficient integer operations? I realize this should be easy to figure out for myself, but I find conflicting answers (for example yes vs no), so I thought it best to ask.

Also, are there any libraries/techniques for arbitrary precision integers on GPUs?

asked Dec 06 '10 by gspr

1 Answer

First, you need to consider the hardware you're using: GPU performance differs widely from one vendor to another.
Second, it also depends on the operations considered: for example, adds may be faster than multiplies.

In my case, I only use NVIDIA devices. For this kind of hardware, the official documentation announces equivalent performance for 32-bit integers and 32-bit single-precision floats on the newer architecture (Fermi). The previous architecture (Tesla) offered equivalent performance for 32-bit integers and floats only for adds and logical operations.

But once again, this may not be true depending on the device and instructions you use.
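If you want to check this on your own card, a quick way is to time a small integer kernel against its float counterpart. Here is a minimal CUDA sketch (the kernel name int_mad, the array size, and the constant 7 are my own choices, not from any documentation) that exercises plain 32-bit integer multiplies and adds:

```cuda
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// Element-wise 32-bit integer multiply-add: out[i] = a[i] * b[i] + c.
__global__ void int_mad(const int* a, const int* b, int* out, int c, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = a[i] * b[i] + c;   // plain 32-bit integer mul and add
}

int main()
{
    const int n = 1 << 20;
    const size_t bytes = n * sizeof(int);

    // Host data
    int* ha = (int*)malloc(bytes);
    int* hb = (int*)malloc(bytes);
    int* hout = (int*)malloc(bytes);
    for (int i = 0; i < n; ++i) { ha[i] = i; hb[i] = 2 * i + 1; }

    // Device buffers
    int *da, *db, *dout;
    cudaMalloc(&da, bytes);
    cudaMalloc(&db, bytes);
    cudaMalloc(&dout, bytes);
    cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
    cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

    int_mad<<<(n + 255) / 256, 256>>>(da, db, dout, 7, n);
    cudaMemcpy(hout, dout, bytes, cudaMemcpyDeviceToHost);

    printf("out[42] = %d\n", hout[42]);   // 42 * 85 + 7 = 3577

    cudaFree(da); cudaFree(db); cudaFree(dout);
    free(ha); free(hb); free(hout);
    return 0;
}
```

Swapping int for float in the kernel and wrapping the launch in cudaEvent timers gives a rough integer-vs-float comparison for whatever device you actually have.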

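Regarding the arbitrary-precision part of the question: the usual building block on GPUs, as elsewhere, is multi-word (limb) arithmetic with explicit carry propagation. Below is a minimal device-side sketch, assuming 128-bit values stored as four 32-bit limbs, least-significant first; the name add128 is hypothetical and not taken from any particular library:

```cuda
// Sketch: 128-bit addition as four 32-bit limbs, carrying through a
// 64-bit intermediate sum. Limb 0 is the least-significant word.
__device__ void add128(const unsigned int a[4],
                       const unsigned int b[4],
                       unsigned int r[4])
{
    unsigned int carry = 0;
    for (int i = 0; i < 4; ++i) {
        unsigned long long s = (unsigned long long)a[i] + b[i] + carry;
        r[i] = (unsigned int)s;            // low 32 bits of the limb sum
        carry = (unsigned int)(s >> 32);   // carry into the next limb
    }
}
```

Real multiprecision code on NVIDIA hardware typically replaces the 64-bit intermediate with the PTX carry-propagating adds (add.cc / addc), but the idea is the same.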
answered Oct 02 '22 by jopasserat