I'm trying to figure out the best approach for displaying a video on an OpenGL texture while preserving the transparency of the video's alpha channel.
Information about using video as a GL texture is covered in "Is it possible using video as texture for GL in iOS?" and "iOS4: how do I use video file as an OpenGL texture?".
Using ffmpeg to get alpha transparency (though that approach isn't App Store friendly) is covered in "iPhone: Display a semi-transparent video on top of a UIView?".
The video source would be filmed in front of a green screen for chroma keying. The video could either be left untouched, keeping the green screen, or processed in a video editing suite and exported to QuickTime Animation or Apple ProRes 4444 with an alpha channel.
There are multiple approaches that I think could work, but I haven't found a complete solution.
I'd love to get your thoughts on the best approach for iOS and OpenGL ES 2.0.
Thanks.
The easiest way to do chroma keying for simple blending of a movie and another scene would be to use the GPUImageChromaKeyBlendFilter from my GPUImage framework. You can supply the movie source as a GPUImageMovie, and then blend that with your background content. The chroma key filter allows you to specify a color, a proximity to that color, and a smoothness of blending to use in the replacement operation. All of this is GPU-accelerated via tuned shaders.
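A minimal setup looks something like the following sketch. The asset names and the assumption that the controller's view is a GPUImageView are placeholders; the filter values are the defaults and should be tuned for your footage:

```objc
#import "GPUImage.h"

@interface MyViewController () // hypothetical controller; its view is a GPUImageView
{
    GPUImageMovie *movieFile;
    GPUImageChromaKeyBlendFilter *chromaKeyBlendFilter;
    GPUImagePicture *backgroundPicture;
}
@end

@implementation MyViewController

- (void)viewDidLoad
{
    [super viewDidLoad];

    // Hypothetical asset name; point this at your green screen movie.
    NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"greenscreen" withExtension:@"m4v"];

    // Keep these as instance variables so they aren't deallocated mid-playback.
    movieFile = [[GPUImageMovie alloc] initWithURL:movieURL];

    chromaKeyBlendFilter = [[GPUImageChromaKeyBlendFilter alloc] init];
    [chromaKeyBlendFilter setColorToReplaceRed:0.0 green:1.0 blue:0.0]; // key out pure green
    chromaKeyBlendFilter.thresholdSensitivity = 0.4; // how close to the key color counts as a match
    chromaKeyBlendFilter.smoothing = 0.1;            // softness of the matte edge

    // First input: the keyed movie. Second input: the background that shows through.
    backgroundPicture = [[GPUImagePicture alloc] initWithImage:[UIImage imageNamed:@"background"]];
    [movieFile addTarget:chromaKeyBlendFilter];
    [backgroundPicture addTarget:chromaKeyBlendFilter];

    [chromaKeyBlendFilter addTarget:(GPUImageView *)self.view];

    [backgroundPicture processImage];
    [movieFile startProcessing];
}

@end
```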
Images, movies, and the live cameras can be used as sources, but if you wish to render this with OpenGL ES content behind your movie, I'd recommend rendering your OpenGL ES content to a texture-backed FBO and passing that texture in via a GPUImageTextureInput.
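A rough sketch of that FBO-plus-GPUImageTextureInput arrangement follows. The texture size, ivar names, and frame timing are illustrative, and it assumes your own EAGLContext shares a sharegroup with GPUImage's context so the texture is visible on both sides:

```objc
#import <OpenGLES/ES2/gl.h>
#import "GPUImage.h"

// Instance variables assumed: GLuint backgroundTexture, backgroundFramebuffer;
// GPUImageTextureInput *textureInput; CMTime frameTime;

- (void)setupBackgroundTextureInput
{
    CGSize renderSize = CGSizeMake(640.0, 480.0); // arbitrary size for this sketch

    // Create the texture that will back the FBO.
    glGenTextures(1, &backgroundTexture);
    glBindTexture(GL_TEXTURE_2D, backgroundTexture);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, (GLsizei)renderSize.width,
                 (GLsizei)renderSize.height, 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    // Attach the texture as the framebuffer's color target.
    glGenFramebuffers(1, &backgroundFramebuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, backgroundFramebuffer);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, backgroundTexture, 0);

    // Wrap the texture so it can act as the blend filter's second input.
    textureInput = [[GPUImageTextureInput alloc] initWithTexture:backgroundTexture
                                                             size:renderSize];
    [textureInput addTarget:chromaKeyBlendFilter];
}

- (void)drawBackgroundFrame
{
    glBindFramebuffer(GL_FRAMEBUFFER, backgroundFramebuffer);
    glViewport(0, 0, 640, 480);
    // ... draw your OpenGL ES scene into the FBO here ...
    glFlush();

    // Tell GPUImage a new background frame is ready; frameTime is a CMTime
    // you advance once per rendered frame (e.g., via CMTimeMake).
    [textureInput processTextureWithFrameTime:frameTime];
}
```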
You could also use this to output a texture containing your movie frames, with the keyed color replaced by a constant color that has a 0 alpha channel. That texture could then be extracted using a GPUImageTextureOutput for later use in your OpenGL ES scene.
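Sketched out, that extraction path might look like this, using the non-blend GPUImageChromaKeyFilter (which keys the matched color to transparency instead of blending in a second source) and a class that conforms to GPUImageTextureOutputDelegate; the asset and method names are mine:

```objc
#import "GPUImage.h"

// Instance variables assumed: GPUImageMovie *movieFile;
// GPUImageChromaKeyFilter *chromaKeyFilter; GPUImageTextureOutput *textureOutput;

- (void)setupKeyedTextureExtraction
{
    NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"greenscreen" withExtension:@"m4v"]; // hypothetical asset
    movieFile = [[GPUImageMovie alloc] initWithURL:movieURL];

    // Replaces pixels near the key color with alpha 0 rather than blending in a second source.
    chromaKeyFilter = [[GPUImageChromaKeyFilter alloc] init];
    [chromaKeyFilter setColorToReplaceRed:0.0 green:1.0 blue:0.0];

    textureOutput = [[GPUImageTextureOutput alloc] init];
    textureOutput.delegate = self;

    [movieFile addTarget:chromaKeyFilter];
    [chromaKeyFilter addTarget:textureOutput];
    [movieFile startProcessing];
}

// GPUImageTextureOutputDelegate method, called on GPUImage's processing
// queue each time a filtered frame is ready.
- (void)newFrameReadyFromTextureOutput:(GPUImageTextureOutput *)callbackTextureOutput
{
    // The texture lives in GPUImage's sharegroup, so your own EAGLContext
    // must share with it before you can bind this texture in your scene.
    GLuint movieTexture = callbackTextureOutput.texture;
    // ... draw the keyed movie frame into your OpenGL ES scene using
    // movieTexture; newer GPUImage versions also expect you to signal
    // when you're finished with the texture so it can be reused ...
}
```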