This ffmpeg example demonstrates how to do hardware decoding: https://github.com/FFmpeg/FFmpeg/blob/release/4.2/doc/examples/hw_decode.c
At line 109 it does this:
/* retrieve data from GPU to CPU */
if ((ret = av_hwframe_transfer_data(sw_frame, frame, 0)) < 0) {
I want to avoid this copy because it takes time. Therefore, I need a way to reuse the decoded video, which is still in GPU memory, to redo the color conversion.
How can I turn the decoded texture in GPU memory into an OpenGL texture without transferring it back to CPU memory, as the code above does?
If that is not possible, how can I do color conversion on the decoded video using OpenGL? I have heard that ffmpeg supports passing OpenGL shaders as input, so I guess it's possible.
In short: it's complicated.
Depending on the hardware backend that ffmpeg uses on your system, you might need to do DirectX, CUDA, or OpenGL interop. Let's assume that you're using the VDPAU backend and want to interop it with OpenGL. It looks like ffmpeg does not expose that functionality through its public interface in a documented way.
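Before choosing an interop path, you need to know which backend actually produced the frame. One way to check, as a sketch (the mapping of pixel formats to interop mechanisms here is my suggestion, not something ffmpeg prescribes):

```c
/* Sketch: identify the hwaccel backend from the decoded frame's pixel
 * format, which tells you which GPU interop path applies. */
#include <libavutil/frame.h>
#include <libavutil/pixfmt.h>

static const char *hw_backend_name(const AVFrame *frame)
{
    switch (frame->format) {
    case AV_PIX_FMT_VDPAU:     return "VDPAU (NV_vdpau_interop)";
    case AV_PIX_FMT_CUDA:      return "CUDA (cuGraphicsGLRegisterImage)";
    case AV_PIX_FMT_DXVA2_VLD: return "DXVA2 (WGL_NV_DX_interop)";
    case AV_PIX_FMT_VAAPI:     return "VAAPI (EGL dma-buf import)";
    default:                   return "software or other";
    }
}
```

This requires the FFmpeg development headers and cannot run standalone; it is only meant to show where the branching happens.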
However, based on the vdpau_transfer_data_from implementation, it seems that you can retrieve the VdpVideoSurface from the AVFrame as follows:
VdpVideoSurface surf = (VdpVideoSurface)(uintptr_t)frame->data[3];
From there you can pass that surface to glVDPAURegisterVideoSurfaceNV from the NV_vdpau_interop and NV_vdpau_interop2 extensions to create OpenGL textures.
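Putting it together, a minimal sketch of the interop sequence could look like the following. It assumes a current OpenGL context on the same GPU, that the NV_vdpau_interop entry points have been loaded, and that `vdp_device` and `vdp_get_proc_address` were obtained when the VDPAU device was created (these names are illustrative); error handling is omitted for brevity:

```c
/* Sketch: map an ffmpeg-decoded VDPAU frame into OpenGL textures via
 * NV_vdpau_interop, avoiding the av_hwframe_transfer_data() copy.
 * Requires a current GL context and loaded NV_vdpau_interop functions. */
#include <stdint.h>
#include <GL/glew.h>
#include <vdpau/vdpau.h>
#include <libavutil/frame.h>

void map_vdpau_frame_to_gl(const AVFrame *frame,
                           VdpDevice vdp_device,
                           VdpGetProcAddress *vdp_get_proc_address)
{
    /* One-time init: hand the VDPAU device to the GL driver. */
    glVDPAUInitNV((void *)(uintptr_t)vdp_device, (void *)vdp_get_proc_address);

    /* ffmpeg stores the surface handle in data[3] for hwaccel frames. */
    VdpVideoSurface surf = (VdpVideoSurface)(uintptr_t)frame->data[3];

    /* A video surface maps to four textures: luma and chroma for each field. */
    GLuint tex[4];
    glGenTextures(4, tex);
    GLvdpauSurfaceNV gl_surf = glVDPAURegisterVideoSurfaceNV(
        (void *)(uintptr_t)surf, GL_TEXTURE_2D, 4, tex);

    glVDPAUSurfaceAccessNV(gl_surf, GL_READ_ONLY);
    glVDPAUMapSurfacesNV(1, &gl_surf);

    /* ... sample tex[0..3] in a fragment shader to do the color conversion ... */

    glVDPAUUnmapSurfacesNV(1, &gl_surf);
    glVDPAUUnregisterSurfaceNV(gl_surf);
    glDeleteTextures(4, tex);
}
```

Note that the registered textures hold the raw YCbCr planes, so the RGB conversion itself is done in your own fragment shader while the surfaces are mapped.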