 

Capture video from camera on Raspberry Pi and filter in OpenGL before encoding

I would like a way to capture video from the camera interface in Raspberry Pi, run it through a filter written as OpenGL shaders, and then send it to the hardware encoder.

This blog post talks about applying OpenGL shader filters to the output of the camera when using raspistill, and this is the corresponding source code. In that case, however, the output does not go to the video encoder, and it runs only on stills, not on video. Also (I am not completely sure) I think it ties into the preview; see these bits: "raspitex_state: A pointer to the GL preview state" and "state->ops.redraw = sobel_redraw".

The blog also talks about a "fastpath"; can someone explain what that means in this context?

Asked Dec 12 '13 by Alex I

1 Answer

The texture conversion will work on any MMAL opaque buffer, i.e. camera preview, stills (up to 2000x2000 resolution), and video. However, the example code only does the GL plumbing for the stills preview. I think someone posted a patch on the RPi forums to make it work with RaspiVid, so you might be able to use that.

Fastpath basically means not copying the buffer data into ARM memory and doing a software conversion. For GL rendering, it means just passing a handle to GL so the GPU driver can use the buffer directly.

Currently, there is no support/fastpath in the drivers for feeding OpenGL-rendered buffers into the video encoder. Instead, the slow and probably impractical path is to call glReadPixels, convert the buffer to YUV, and pass the converted buffer to the encoder.

A fastpath is certainly possible, and I've done some work on porting this to the RPi drivers, but some other framework is required and I won't get a chance to look at this until the New Year.

Answered Oct 01 '22 by Tim Gover