I am interested in a way to read the GPU temperature (the graphics processing unit, the main chip of the graphics card) using a video card driver API.
Everyone knows there are two major chip manufacturers (the popular ones, at least) - ATI and nVIDIA - so there are two different kinds of drivers to read the temperature from. I'm interested in learning how to do it for each vendor's driver.
The language is not critical - it could be C/C++, something on the .NET platform, or Java - but let's say that .NET is preferred.
Has anyone done this before?
What is a normal GPU temperature for gaming? Optimum GPU gaming temperatures range from 65 to 85 °C (149 to 185 °F) under normal use. Safe limits vary from card to card, but a common rule of thumb is to keep a graphics card under 80 °C (176 °F). Some AMD GPUs (such as the Radeon RX 5700 and RX 6000 series) are rated to run as high as 110 °C (230 °F).
For nVidia you would use nvcpl.dll.
Here's the documentation:
http://developer.download.nvidia.com/SDK/9.5/Samples/DEMOS/common/src/NvCpl/docs/NVControlPanel_API.pdf
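nvcpl.dll is a plain C-style DLL, so from .NET you would call it through P/Invoke. Below is a minimal C# sketch; the export name NvCplGetThermalSettings and its parameter list are written from memory of that control panel API, so treat them as assumptions and verify them against the linked PDF before relying on them.

using System;
using System.Runtime.InteropServices;

class NvidiaGpuTemperature
{
    // Assumption: the control panel API exposes a thermal query roughly like this.
    // Check the export name, the meaning of the first parameter (GPU/monitor index)
    // and the parameter order against NVControlPanel_API.pdf.
    [DllImport("nvcpl.dll", EntryPoint = "NvCplGetThermalSettings")]
    private static extern bool NvCplGetThermalSettings(
        uint gpuIndex,              // assumed: index of the GPU (or Windows monitor) to query
        out uint coreTempCelsius,
        out uint ambientTempCelsius,
        out uint upperLimitCelsius);

    static void Main()
    {
        uint core, ambient, limit;
        if (NvCplGetThermalSettings(0, out core, out ambient, out limit))
            Console.WriteLine("GPU 0: core {0} °C, ambient {1} °C, limit {2} °C", core, ambient, limit);
        else
            Console.WriteLine("Query failed (no nVidia driver, or the export name/signature differs).");
    }
}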
I found this: AMD Display Library SDK (ADL for short). That covers ATI cards.
http://developer.amd.com/display-library-adl-sdk/
Link to the original page, via the Wayback Machine:
http://web.archive.org/web/20101103020811/http://developer.amd.com/gpu/adlsdk/Pages/default.aspx
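ADL is also a plain C API (exported from atiadlxx.dll on 64-bit Windows, atiadlxy.dll on 32-bit), so the same P/Invoke approach works from .NET. The C# sketch below uses the Overdrive5 temperature call from the ADL SDK headers; which Overdrive generation is available depends on the card and driver, so take this as a starting point rather than a drop-in solution.

using System;
using System.Runtime.InteropServices;

class AmdGpuTemperature
{
    // ADL asks the caller to supply a memory-allocation callback for buffers it returns.
    private delegate IntPtr ADL_Main_Memory_Alloc(int size);
    private static readonly ADL_Main_Memory_Alloc Allocator = size => Marshal.AllocCoTaskMem(size);

    [StructLayout(LayoutKind.Sequential)]
    private struct ADLTemperature
    {
        public int Size;         // must be set to the size of this struct
        public int Temperature;  // reported in millidegrees Celsius
    }

    [DllImport("atiadlxx.dll")]
    private static extern int ADL_Main_Control_Create(ADL_Main_Memory_Alloc callback, int enumConnectedAdapters);

    [DllImport("atiadlxx.dll")]
    private static extern int ADL_Main_Control_Destroy();

    [DllImport("atiadlxx.dll")]
    private static extern int ADL_Overdrive5_Temperature_Get(int adapterIndex, int thermalControllerIndex, ref ADLTemperature temperature);

    static void Main()
    {
        const int ADL_OK = 0;

        // 1 = enumerate only adapters that are physically present and enabled.
        if (ADL_Main_Control_Create(Allocator, 1) != ADL_OK)
        {
            Console.WriteLine("ADL initialisation failed (no ATI/AMD driver?).");
            return;
        }

        var temp = new ADLTemperature { Size = Marshal.SizeOf(typeof(ADLTemperature)) };
        if (ADL_Overdrive5_Temperature_Get(0, 0, ref temp) == ADL_OK)
            Console.WriteLine("Adapter 0: {0} °C", temp.Temperature / 1000.0);

        ADL_Main_Control_Destroy();
    }
}

ADL reports the temperature in millidegrees Celsius, hence the division by 1000.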