 

Vector additions on GPU in golang

Tags:

go

gpu

I'm writing an application that requires adding float vectors of length 5000 many times per second. Is it possible to make the GPU perform the calculations, and how would that be done? I need it to run on both Windows and Linux (later a Raspberry Pi), so CUDA is out of the question as I don't have an Nvidia graphics card.

asked Jan 01 '17 by Pownyan


1 Answer

You can't directly talk to Nvidia GPUs from Go. You'd need to use cgo to call a C library from Go. See slide #8 in this presentation for one example (also see the full talk).

There are some Go packages that wrap the cgo part I mentioned above into a Go library. mumax is one such package.

answered Sep 21 '22 by Dhananjay