Android encoding using MediaCodec and a Surface

I've been rendering video through MediaCodec directly to a Surface taken from a SurfaceView in my UI. This works great.

I am now attempting to use MediaCodec as an encoder. As a test, I want to render to the Surface (as above) and loop it back through a different instance of MediaCodec configured as an encoder.

I see the createInputSurface() method of the encoder. I think I want the encoder to create this surface and then have the decoder MediaCodec use this as the surface to draw to. First off, is this possible?

Secondly, I'm not sure how to create a SurfaceView from the Surface that the encoder creates. I've only ever extracted a Surface from a SurfaceView, and I don't see from the docs how to do this in reverse.

asked Sep 07 '15 by Mat DePasquale


1 Answer

Surfaces are the "producer" side of a producer-consumer arrangement. Generally speaking, the API is centered around consumers, which create both ends and then hand the producer interface (the Surface) back to you.

So for a SurfaceView or a MediaCodec encoder, you create the object, and get its Surface. Then you send buffers of graphics data to them, with Canvas, OpenGL ES, or a MediaCodec decoder.
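
For example, here's a minimal sketch (Java, API 18+) of creating the encoder side and getting its input Surface. The format values here are illustrative, not required:

    import android.media.MediaCodec;
    import android.media.MediaCodecInfo;
    import android.media.MediaFormat;
    import android.view.Surface;

    MediaFormat format = MediaFormat.createVideoFormat("video/avc", 1280, 720);
    // Surface input requires COLOR_FormatSurface.
    format.setInteger(MediaFormat.KEY_COLOR_FORMAT,
            MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);
    format.setInteger(MediaFormat.KEY_BIT_RATE, 4000000);
    format.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
    format.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 1);

    MediaCodec encoder = MediaCodec.createEncoderByType("video/avc");
    encoder.configure(format, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    // createInputSurface() must be called after configure() and before start().
    Surface encoderInputSurface = encoder.createInputSurface();
    encoder.start();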

There is no way to take the encoder's input Surface and use it as the SurfaceView's display Surface -- they're two different pipelines. The SurfaceView's consumer is in the system compositor (SurfaceFlinger), which is why you have to wait for the "surface created" callback to fire. The MediaCodec encoder's consumer is in the mediaserver process, though the asynchronicity is better concealed.
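
This is why SurfaceView code always waits on the holder callback before rendering; a sketch, assuming surfaceView is your SurfaceView:

    import android.view.SurfaceHolder;

    surfaceView.getHolder().addCallback(new SurfaceHolder.Callback() {
        @Override public void surfaceCreated(SurfaceHolder holder) {
            // Only now is holder.getSurface() safe to hand to a decoder.
        }
        @Override public void surfaceChanged(SurfaceHolder holder,
                int format, int width, int height) {}
        @Override public void surfaceDestroyed(SurfaceHolder holder) {
            // Stop rendering; the consumer side is going away.
        }
    });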

Sending the MediaCodec decoder output to a SurfaceView is straightforward, as is sending the output to a MediaCodec encoder. As you surmised, just pass the encoder's input Surface to the decoder. Where life gets interesting is when you want to do both of those things at the same time.
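
A sketch of that wiring, assuming encoderInputSurface came from encoder.createInputSurface() as above, and decoderFormat came from something like MediaExtractor.getTrackFormat():

    MediaCodec decoder = MediaCodec.createDecoderByType(
            decoderFormat.getString(MediaFormat.KEY_MIME));
    // The encoder's input Surface becomes the decoder's output Surface;
    // frames travel codec-to-codec without touching app memory.
    decoder.configure(decoderFormat, encoderInputSurface, null, 0);
    decoder.start();

    // Later, when draining decoder output, "rendering" the buffer is what
    // actually pushes the frame into the encoder:
    decoder.releaseOutputBuffer(outputIndex, true /* render */);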

The code underlying Surface (called BufferQueue) should be capable (as of Lollipop) of multiplexing, but I'm not aware of an API in Lollipop that exposes that capability to applications, which means you're stuck doing things the hard way.

The hard way involves creating a SurfaceTexture (a/k/a GLConsumer), which is the consumer end of the pipe. From that you can create a Surface, using the sole constructor, which takes a SurfaceTexture. You hand that Surface to the MediaCodec decoder. Now every frame that comes out will be converted to a GLES texture by the SurfaceTexture. You can render that texture to both the SurfaceView and the encoder's input Surface.
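
A sketch of that pipeline, assuming you've already stood up an EGL context on a render thread, created one EGLSurface for the SurfaceView and one for the encoder's input Surface, and generated textureId as a GL_TEXTURE_EXTERNAL_OES texture:

    import android.graphics.SurfaceTexture;
    import android.view.Surface;

    SurfaceTexture surfaceTexture = new SurfaceTexture(textureId);
    surfaceTexture.setOnFrameAvailableListener(st -> {
        // Signal the render thread that a new frame is ready to latch.
    });

    // The "sole constructor" mentioned above: Surface(SurfaceTexture).
    Surface decoderOutputSurface = new Surface(surfaceTexture);
    decoder.configure(decoderFormat, decoderOutputSurface, null, 0);

    // Per frame, on the thread that owns the EGL context:
    surfaceTexture.updateTexImage();  // latch the newest decoded frame
    // Then draw the textured quad twice with your own GLES code:
    //   1) make the SurfaceView's EGLSurface current, draw, eglSwapBuffers()
    //   2) make the encoder input's EGLSurface current, draw, eglSwapBuffers()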

You can find various examples in Grafika, and a longer explanation of the mechanics in the graphics architecture doc.

answered Sep 29 '22 by fadden