MediaCodec H264 Encoder not working on Snapdragon 800 devices

I have written an H.264 stream encoder using Android's MediaCodec API. I tested it on about ten devices with different processors and it worked on all of them, except on Snapdragon 800 powered ones (Google Nexus 5 and Sony Xperia Z1). On those devices I get the SPS and PPS and the first keyframe, but after that mEncoder.dequeueOutputBuffer(mBufferInfo, 0) only returns MediaCodec.INFO_TRY_AGAIN_LATER. I have already experimented with different timeouts, bitrates, resolutions and other configuration options, to no avail. The result is always the same.

I use the following code to initialise the encoder:

        mBufferInfo = new MediaCodec.BufferInfo();
        mEncoder = MediaCodec.createEncoderByType("video/avc");
        MediaFormat mediaFormat = MediaFormat.createVideoFormat("video/avc", 640, 480);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 768000);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 30);
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, mEncoderColorFormat);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 10);
        mEncoder.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

where the selected color format is chosen with:

        MediaCodecInfo.CodecCapabilities capabilities = mCodecInfo.getCapabilitiesForType(MIME_TYPE);
        for (int i = 0; i < capabilities.colorFormats.length && selectedColorFormat == 0; i++)
        {
            int format = capabilities.colorFormats[i];
            switch (format) {
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Planar:
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420PackedSemiPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_TI_FormatYUV420PackedSemiPlanar:
                case MediaCodecInfo.CodecCapabilities.COLOR_QCOM_FormatYUV420SemiPlanar:
                    selectedColorFormat = format;
                    break;
                default:
                    LogHandler.e(LOG_TAG, "Unsupported color format " + format);
                    break;
            }
        }

And I get the data by doing:

        ByteBuffer[] inputBuffers = mEncoder.getInputBuffers();
        ByteBuffer[] outputBuffers = mEncoder.getOutputBuffers();

        int inputBufferIndex = mEncoder.dequeueInputBuffer(-1);
        if (inputBufferIndex >= 0)
        {
            // fill inputBuffers[inputBufferIndex] with valid data
            ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            inputBuffer.put(rawFrame);
            mEncoder.queueInputBuffer(inputBufferIndex, 0, rawFrame.length, 0, 0);
            LogHandler.e(LOG_TAG, "Queue Buffer in " + inputBufferIndex);
        }

        while(true)
        {
            int outputBufferIndex = mEncoder.dequeueOutputBuffer(mBufferInfo, 0);
            if (outputBufferIndex >= 0)
            {
                Log.d(LOG_TAG, "Queue Buffer out " + outputBufferIndex);
                ByteBuffer buffer = outputBuffers[outputBufferIndex];
                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0)
                {
                    // Config Bytes means SPS and PPS
                    Log.d(LOG_TAG, "Got config bytes");
                }

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_SYNC_FRAME) != 0)
                {
                    // Marks a Keyframe
                    Log.d(LOG_TAG, "Got Sync Frame");
                }

                if (mBufferInfo.size != 0)
                {
                    // adjust the ByteBuffer values to match BufferInfo (not needed?)
                    buffer.position(mBufferInfo.offset);
                    buffer.limit(mBufferInfo.offset + mBufferInfo.size);

                    int nalUnitLength = 0;
                    while((nalUnitLength = parseNextNalUnit(buffer)) != 0)
                    {
                        switch(mVideoData[0] & 0x1f) // NAL unit type is the low 5 bits
                        {
                            // SPS
                            case 0x07:
                            {
                                Log.d(LOG_TAG, "Got SPS");
                                break;
                            }

                            // PPS
                            case 0x08:
                            {
                                Log.d(LOG_TAG, "Got PPS");
                                break;
                            }

                            // Key Frame
                            case 0x05:
                            {
                                Log.d(LOG_TAG, "Got Keyframe");
                            }

                            //$FALL-THROUGH$
                            default:
                            {
                                // Process Data
                                break;
                            }
                        }
                    }
                }

                mEncoder.releaseOutputBuffer(outputBufferIndex, false);

                if ((mBufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0)
                {
                    // Stream is marked as done,
                    // break out of while
                    Log.d(LOG_TAG, "Marked EOS");
                    break;
                }
            }
            else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED)
            {
                outputBuffers = mEncoder.getOutputBuffers();
                Log.d(LOG_TAG, "Output Buffer changed " + outputBuffers);
            }
            else if(outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED)
            {
                MediaFormat newFormat = mEncoder.getOutputFormat();
                Log.d(LOG_TAG, "Media Format Changed " + newFormat);
            }
            else if(outputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER)
            {
                // No Data, break out
                break;
            }
            else
            {
                // Unexpected State, ignore it
                Log.d(LOG_TAG, "Unexpected State " + outputBufferIndex);
            }
        }

Thanks for your help!

1 Answer

You need to set the presentationTimeUs parameter in your call to queueInputBuffer; your code currently passes 0. Most encoders ignore this parameter and you can encode for streaming without issues, but the encoder used on Snapdragon 800 devices does not.

This parameter represents the recording time of your frame and therefore needs to increase by the number of microseconds between the frame you want to encode and the previous frame.

If the parameter is set to the same value as in the previous frame, the encoder drops the frame. If it is set to a value that is too small (e.g. 100000 on a 30 FPS recording), the quality of the encoded frames drops.
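
As an illustration, here is a minimal sketch of how the input side could compute and pass the timestamp, assuming a constant 30 FPS source; the mFrameIndex counter is a hypothetical field (not part of the original code) that is incremented once per queued frame:

        // Hypothetical: derive presentationTimeUs from a per-frame counter,
        // assuming a constant 30 FPS source (about 33,333 us per frame).
        long presentationTimeUs = mFrameIndex * 1000000L / 30;

        int inputBufferIndex = mEncoder.dequeueInputBuffer(-1);
        if (inputBufferIndex >= 0)
        {
            ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
            inputBuffer.clear();
            inputBuffer.put(rawFrame);
            // Pass the strictly increasing timestamp instead of 0
            mEncoder.queueInputBuffer(inputBufferIndex, 0, rawFrame.length, presentationTimeUs, 0);
            mFrameIndex++;
        }

For live capture with a variable frame rate, a timestamp derived from the capture clock (for example System.nanoTime() / 1000 measured relative to the first frame) serves the same purpose, as long as it strictly increases from one frame to the next.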
