
Encoding FBO textures to H.264 video directly on GPU

I am planning to write an app in which I need the ability to encode video from an image sequence that comes from a texture attached to an FBO. I want to leverage GPU parallelism and do as much as possible "server side", i.e. on the GPU. I found that NVIDIA has an SDK which encodes video using CUDA, but after reading the SDK white paper it is still not clear to me whether it is possible to hand an FBO texture to NVCUVENC (the encoder) as the frame source without leaving the GPU. I believe that uploading frames from the CPU adds huge overhead to the encoding process: if I can't access the FBO texture directly on the GPU side, I have to read its pixels back to the CPU and then send them to the GPU again for encoding. So I basically have two questions:

  1. Do the NVIDIA codec libraries allow doing what I want?

  2. If not, can it be done with other GPGPU SDKs like OpenCL, or even OpenGL 4.3 compute shaders?

Anything related to DirectX or other Windows-specific technology can't be taken into account, as I need this for Linux. Also, I use NVIDIA hardware only.

asked Oct 22 '22 by Michael IV


1 Answer

Yes, you can definitely transfer OpenGL-generated images into the CUDA codec library. Have a look at the CUDA API reference; of most interest to you are the sections on Texture Reference Management and Graphics Interoperability. The basic idea is that CUDA allows you to obtain a device-side memory mapping of a texture object, which you can then pass as the image source to the encoder, so the pixel data never has to leave the GPU.
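A rough sketch of that interop flow using the CUDA driver API might look like the following. Note this is an outline under assumptions, not a complete program: `tex` (the GL texture attached to your FBO), `enc_input` and the final hand-off to the encoder are placeholders, since the exact copy step depends on the input format NVCUVENC expects, and error checking is omitted.

```c
#include <cuda.h>
#include <cudaGL.h>   /* CUDA <-> OpenGL interop (driver API) */

/* One-time setup: register the FBO's color texture with CUDA.
   'tex' is assumed to be the GLuint of the texture attached to the FBO. */
CUgraphicsResource res;
cuGraphicsGLRegisterImage(&res, tex, GL_TEXTURE_2D,
                          CU_GRAPHICS_REGISTER_FLAGS_READ_ONLY);

/* Per frame, after rendering into the FBO: */
cuGraphicsMapResources(1, &res, 0);

/* Get a CUDA array view of the texture's level 0. */
CUarray arr;
cuGraphicsSubResourceGetMappedArray(&arr, res, 0, 0);

/* Copy from the array into the encoder's device input buffer
   (e.g. via cuMemcpy2D with srcArray = arr), or process it with a
   kernel first to convert RGBA -> the encoder's expected format.
   'enc_input' and its pitch are placeholders for the encoder side. */

cuGraphicsUnmapResources(1, &res, 0);

/* On shutdown: cuGraphicsUnregisterResource(res); */
```

The key point is that `cuGraphicsSubResourceGetMappedArray` hands you device-resident memory, so the frame stays on the GPU the whole way; the only copies are device-to-device.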

answered Oct 27 '22 by datenwolf