I've recently read a lot about software (mostly scientific/math and encryption related) that moves part of its calculations onto the GPU, yielding a 100-1000 (!) fold speedup for supported operations.
Is there a library, API, or other way to run something on the GPU via C#? I'm thinking of a simple Pi calculation. I have a GeForce 8800 GTX, if that's relevant at all (though I'd prefer a card-independent solution).
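For reference, the kind of "simple Pi calculation" I have in mind is a Monte Carlo estimate, which is embarrassingly parallel and therefore a natural fit for the GPU. Here is a plain C# CPU-only baseline (no GPU library involved; the GPU version would run the per-sample loop body across many threads):

```csharp
using System;

class PiEstimate
{
    // Monte Carlo estimate of Pi: sample random points in the unit square
    // and count how many fall inside the quarter circle of radius 1.
    // The ratio inside/total approaches Pi/4 as the sample count grows.
    static double EstimatePi(int samples)
    {
        var rng = new Random(42); // fixed seed so runs are reproducible
        int inside = 0;
        for (int i = 0; i < samples; i++)
        {
            double x = rng.NextDouble();
            double y = rng.NextDouble();
            if (x * x + y * y <= 1.0) inside++;
        }
        return 4.0 * inside / samples;
    }

    static void Main()
    {
        // With a million samples the estimate is typically within ~0.01 of Pi.
        Console.WriteLine(EstimatePi(1000000));
    }
}
```

Each sample is independent of every other, which is exactly the access pattern GPU frameworks like CUDA or compute shaders exploit: one thread per sample (or per batch of samples), followed by a parallel reduction of the counts.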
It's a very new technology, but you might investigate CUDA. Since your question is tagged with C#, here is a .NET wrapper.
As a bonus, it appears that your 8800 GTX supports CUDA.
Another option that hasn't been mentioned for GPU calculation from C# is Brahma.
Brahma provides a LINQ-based abstraction for GPU calculations - it's basically LINQ to GPU. It works over OpenGL and DirectX without extra libraries (but requires Shader Model 3). Some of the samples are fairly amazing.
You might want to look at this question.
You're probably looking for Accelerator, but if you're interested in game development in general, I'd suggest you take a look at XNA.
You can access the latest Direct3D APIs from .NET using the Windows API Code Pack. Direct3D 11 comes with Compute Shaders, which are roughly comparable to CUDA but also work on non-NVIDIA GPUs.
Note that Managed DirectX and XNA are limited to the Direct3D 9 feature set, which is somewhat difficult to use for GPGPU.