Android MediaCodec slower in async-mode than in synchronous mode?

Once again, I have a question regarding Android's MediaCodec class.

I have successfully managed to decode raw h264 content and display the result in two TextureViews. The h264 stream comes from a server that is running an OpenGL scene.

The scene has a camera and hence is responsive to user input.

To further reduce the latency between an input on the server and the actual result on the smartphone, I was thinking about using MediaCodec in its async mode.

Here is how I set up both variants, synchronous and asynchronous:

Async:

//decoderCodec is "video/avc"
MediaCodec codec = MediaCodec.createDecoderByType(decoderCodec);
MediaFormat fmt = MediaFormat.createVideoFormat(decoderCodec, 1280, 720);
codec.setCallback(new MediaCodec.Callback() {

    @Override
    public void onInputBufferAvailable(MediaCodec codec, int index) {
        byte[] frameData;
        try {
            frameData = frameQueue.take(); //this call is blocking
        } catch (InterruptedException e) {
            return;
        }

        ByteBuffer inputData = codec.getInputBuffer(index);
        inputData.clear();
        inputData.put(frameData);

        codec.queueInputBuffer(index, 0, frameData.length, 0, 0);
    }

    @Override
    public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
        codec.releaseOutputBuffer(index, true);
    }

     //The other two callback methods (onError and onOutputFormatChanged) are left blank at the moment.

});


codec.configure(fmt, surface, null, 0);
codec.start();

Sync: (set up like async, except without the codec.setCallback(...) part). The class that both variants reside in implements Runnable.

public void run() {

    while(!Thread.interrupted())
    {
        if(!IS_ASYNC) {
            byte[] frameData;
            try {
                frameData = frameQueue.take(); //this call is blocking
            } catch (InterruptedException e) {
                break;
            }

            int inIndex = codec.dequeueInputBuffer(BUFFER_TIMEOUT);

            if (inIndex >= 0) {
                ByteBuffer input = codec.getInputBuffer(inIndex);
                input.clear();
                input.put(frameData);
                codec.queueInputBuffer(inIndex, 0, frameData.length, 0, 0);
            }

            MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
            int outIndex = codec.dequeueOutputBuffer(bufferInfo, BUFFER_TIMEOUT);

            if(outIndex >= 0)
                codec.releaseOutputBuffer(outIndex, true);
        }
        else {
            //Just for testing: if we are in async mode, this thread has nothing to do.
            try { Thread.sleep(3000); } catch (InterruptedException e) { break; }
        }
    }
}

Both approaches work, but I'm observing that video played in synchronous mode is much smoother, and the latency is also lower.

I came up with the idea of using async mode because frameQueue is a LinkedBlockingDeque: if the synchronous decoder waits too long for new frame data to arrive, decoded output may already be available but will not be displayed, because the loop is blocked on the queue. On the other hand, I don't want to busy-wait and poll the queue, the input buffers, and the output buffers all the time.
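
For reference, frameQueue is filled from a separate reader thread that receives the frames from the server. A minimal sketch of such a producer (the FrameReader class and the 4-byte length-prefix framing are placeholders for illustration, not the actual protocol):

import java.io.DataInputStream;
import java.io.IOException;
import java.util.concurrent.BlockingQueue;

//Hypothetical producer: reads one frame at a time from the server connection
//and hands it to the decoder via frameQueue. The 4-byte length prefix is an
//assumption for this sketch only.
class FrameReader implements Runnable {
    private final BlockingQueue<byte[]> frameQueue;
    private final DataInputStream in;

    FrameReader(BlockingQueue<byte[]> frameQueue, DataInputStream in) {
        this.frameQueue = frameQueue;
        this.in = in;
    }

    @Override
    public void run() {
        while (!Thread.interrupted()) {
            try {
                int length = in.readInt();        //assumed frame-length prefix
                byte[] frameData = new byte[length];
                in.readFully(frameData);          //one raw h264 access unit
                frameQueue.put(frameData);        //blocks if the decoder falls behind
            } catch (IOException | InterruptedException e) {
                return;
            }
        }
    }
}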

So I tried async mode using the callbacks, but the result I get is even worse than in synchronous mode.

The question I have for you guys now is this:

Why? Did I misuse async mode, or is it something else?

Thanks for any feedback!

Christoph

Edit: The following is the updated code; I am only listing the changed parts. As @mstorsjo correctly pointed out, the culprit was waiting for more frame data inside onInputBufferAvailable(). The updated version feeds another BlockingQueue with the available input buffer indices. In an additional thread, we wait for new frame data AND a free buffer index, and only then queue the frame data for decoding.

public class DisplayThread implements Runnable {
    private BlockingQueue<Integer> freeInputBuffers;
    //skipped the uninteresting parts.

    private void initCodec(String decoderCodec) {       
        //skipped the uninteresting parts.
        codec.setCallback(new MediaCodec.Callback() {

            @Override
            public void onInputBufferAvailable(MediaCodec codec, int index) {
                freeInputBuffers.add(index);
            }

            //Don't care about the rest of the callbacks for this demo...
        });
    }   

    @Override
    public void run() {
        while(!Thread.interrupted())
        {

            byte [] frameData;
            int inputIndex;

            try {
                frameData = frameQueue.take();
                //this was, indeed, the culprit. We can wait in an additional thread for a buffer index to
                // become free AND for new frameData to arrive. Waiting inside the callback slows down
                // the decoder.
                inputIndex = freeInputBuffers.take();
            } catch (InterruptedException e) {
                break;
            }

            ByteBuffer inputData = codec.getInputBuffer(inputIndex);
            inputData.clear();
            inputData.put(frameData);
            codec.queueInputBuffer(inputIndex, 0, frameData.length, 0, 0);      
        }

        codec.stop();
        codec.release();
    }
}
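
For completeness, the thread is wired up roughly like this (the DisplayThread constructor arguments shown here are placeholders, since I skipped those parts above):

//Hypothetical wiring; the constructor arguments are assumptions, the real ones were skipped above.
BlockingQueue<byte[]> frameQueue = new LinkedBlockingDeque<>();
DisplayThread displayThread = new DisplayThread(frameQueue, surface, "video/avc");
new Thread(displayThread, "decoder-feeder").start();

//The network reader keeps calling frameQueue.put(frameData) as frames arrive.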


1 Answer

I would not be surprised if the blocking call in onInputBufferAvailable is the culprit. It feels probable that both onInputBufferAvailable and onOutputBufferAvailable are called within the same thread, and if you block in one, you stop the other one from running.

I would suggest changing it so that onInputBufferAvailable just pushes the buffer index onto some queue and signals a different thread that another buffer is available; then have this second thread wait for buffer indices from the queue and do the blocking fetch of input data there.
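
In rough, untested Java, that split could look something like this (the queue type and the worker thread handling are just examples):

//Untested sketch of the suggested split: the callbacks never block, a separate
//worker thread does the blocking fetch of frame data.
final BlockingQueue<Integer> freeInputBuffers = new LinkedBlockingQueue<>();

codec.setCallback(new MediaCodec.Callback() {
    @Override
    public void onInputBufferAvailable(MediaCodec codec, int index) {
        freeInputBuffers.add(index); //just hand the index over, never block here
    }

    @Override
    public void onOutputBufferAvailable(MediaCodec codec, int index, MediaCodec.BufferInfo info) {
        codec.releaseOutputBuffer(index, true); //render to the surface as before
    }

    @Override
    public void onError(MediaCodec codec, MediaCodec.CodecException e) { }

    @Override
    public void onOutputFormatChanged(MediaCodec codec, MediaFormat format) { }
});

//Worker thread: free to block on both queues without stalling the codec's callback thread.
new Thread(new Runnable() {
    @Override
    public void run() {
        while (!Thread.interrupted()) {
            try {
                byte[] frameData = frameQueue.take();   //blocking fetch of input data
                int index = freeInputBuffers.take();    //blocking wait for a free input buffer
                ByteBuffer buffer = codec.getInputBuffer(index);
                buffer.clear();
                buffer.put(frameData);
                codec.queueInputBuffer(index, 0, frameData.length, 0, 0);
            } catch (InterruptedException e) {
                return;
            }
        }
    }
}).start();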

mstorsjo