 

Accuracy of GPU for scientific computing

Tags:

gpu

An electrical engineer recently cautioned me against using GPUs for scientific computing (i.e. where accuracy really matters) on the basis that they lack the hardware safeguards a CPU has. Is this true, and if so, how common/substantial is the problem in typical hardware?

Ari B. Friedman asked Aug 24 '12


1 Answer

Actually, modern GPUs fit extremely well for scientific computing, and many HPC applications are being at least partially ported to run on GPUs for the sake of performance and energy efficiency. Unlike older GPUs, modern architectures (NVIDIA's Fermi or Kepler, for example) provide fully IEEE-754-compliant arithmetic in both single and double precision, so you can rely on them much as you would a modern CPU.
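The practical accuracy question is usually precision choice rather than the device: IEEE-754 rounding behaves the same on a compliant GPU as on a CPU. As a rough illustration (run here on the CPU with NumPy as a stand-in, since the rounding rules are identical on compliant hardware), single precision silently absorbs small increments and drifts in long accumulations where double precision does not:

```python
import numpy as np

# float32 has a 24-bit significand, so adding 1.0 to 2**24 is silently lost.
big = np.float32(2**24)                    # 16777216
print(big + np.float32(1.0) == big)        # True: the increment vanishes

# float64 represents the result exactly.
print(np.float64(2**24) + 1.0)             # 16777217.0

# The same rounding compounds over long sequential accumulations:
# summing 0.1 ten million times (cumsum accumulates sequentially).
a32 = np.full(10_000_000, 0.1, dtype=np.float32)
a64 = np.full(10_000_000, 0.1, dtype=np.float64)
seq32 = float(np.cumsum(a32)[-1])
seq64 = float(np.cumsum(a64)[-1])
print(abs(seq32 - 1e6))                    # large drift in single precision
print(abs(seq64 - 1e6))                    # tiny drift in double precision
```

So if an algorithm needs double precision, use `binary64` on the GPU exactly as you would on the CPU; the caveat is that consumer GPUs often run double precision at a fraction of single-precision throughput.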

a_a_t answered Sep 25 '22