 

Video compression on Android using the new MediaCodec API


In my app I'm trying to upload some videos that the user picked from the gallery. The problem is that Android video files are usually too big to upload, so we want to compress them first by lowering the bitrate/resolution.

I've just heard about the new MediaCodec API that was introduced with API 16 (I previously tried to do this with ffmpeg).

What I'm doing right now is the following: first I decode the input video with a video decoder, configured with the format read from the input file. Next, I create a standard video encoder with some predefined parameters and use it to encode the decoder's output buffers. Then I save the encoder's output buffers to a file.

Everything looks good - the same number of packets is written to and read from each input and output buffer, but the final file doesn't look like a video file and can't be opened by any video player.

The decoding itself seems to be fine, because I tested it by rendering to a Surface: I configure the decoder with a Surface, pass the render flag when calling releaseOutputBuffer, and I can see the video on the screen.
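
For reference, that Surface test looks roughly like this (just a minimal sketch; the extractor/format setup and the Surface itself are assumed to already exist, and error handling is left out):

    // Minimal sketch of the Surface-backed decode test.
    // `format` comes from the extractor's video track, `surface` from a SurfaceView/TextureView.
    MediaCodec testDecoder = MediaCodec.createDecoderByType(format.getString(MediaFormat.KEY_MIME));
    testDecoder.configure(format, surface, null, 0); // decode straight into the Surface
    testDecoder.start();

    MediaCodec.BufferInfo testInfo = new MediaCodec.BufferInfo();
    int outIndex = testDecoder.dequeueOutputBuffer(testInfo, 10000);
    if (outIndex >= 0) {
        // passing `true` renders the decoded frame to the Surface instead of handing back the bytes
        testDecoder.releaseOutputBuffer(outIndex, true);
    }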

Here is the code I'm using:

    //init decoder
    MediaCodec decoder = MediaCodec.createDecoderByType(mime);
    decoder.configure(format, null, null, 0);
    decoder.start();
    ByteBuffer[] codecInputBuffers = decoder.getInputBuffers();
    ByteBuffer[] codecOutputBuffers = decoder.getOutputBuffers();

    //init encoder
    MediaCodec encoder = MediaCodec.createEncoderByType(mime);
    int width = format.getInteger(MediaFormat.KEY_WIDTH);
    int height = format.getInteger(MediaFormat.KEY_HEIGHT);
    MediaFormat mediaFormat = MediaFormat.createVideoFormat(mime, width, height);
    mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 400000);
    mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE, 25);
    mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT, MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420SemiPlanar);
    mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 5);
    encoder.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
    encoder.start();
    ByteBuffer[] encoderInputBuffers = encoder.getInputBuffers();
    ByteBuffer[] encoderOutputBuffers = encoder.getOutputBuffers();

    extractor.selectTrack(0);

    boolean sawInputEOS = false;
    boolean sawOutputEOS = false;
    boolean sawOutputEOS2 = false;
    MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();
    BufferInfo encoderInfo = new MediaCodec.BufferInfo();

    while (!sawInputEOS || !sawOutputEOS || !sawOutputEOS2) {
        if (!sawInputEOS) {
            sawInputEOS = decodeInput(extractor, decoder, codecInputBuffers);
        }

        if (!sawOutputEOS) {
            int outputBufIndex = decoder.dequeueOutputBuffer(info, 0);
            if (outputBufIndex >= 0) {
                sawOutputEOS = decodeEncode(extractor, decoder, encoder, codecOutputBuffers, encoderInputBuffers, info, outputBufIndex);
            } else if (outputBufIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                Log.d(LOG_TAG, "decoding INFO_OUTPUT_BUFFERS_CHANGED");
                codecOutputBuffers = decoder.getOutputBuffers();
            } else if (outputBufIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                final MediaFormat oformat = decoder.getOutputFormat();
                Log.d(LOG_TAG, "decoding Output format has changed to " + oformat);
            } else if (outputBufIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                Log.d(LOG_TAG, "decoding dequeueOutputBuffer timed out!");
            }
        }

        if (!sawOutputEOS2) {
            int encodingOutputBufferIndex = encoder.dequeueOutputBuffer(encoderInfo, 0);
            if (encodingOutputBufferIndex >= 0) {
                sawOutputEOS2 = encodeOuput(outputStream, encoder, encoderOutputBuffers, encoderInfo, encodingOutputBufferIndex);
            } else if (encodingOutputBufferIndex == MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED) {
                Log.d(LOG_TAG, "encoding INFO_OUTPUT_BUFFERS_CHANGED");
                encoderOutputBuffers = encoder.getOutputBuffers();
            } else if (encodingOutputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
                final MediaFormat oformat = encoder.getOutputFormat();
                Log.d(LOG_TAG, "encoding Output format has changed to " + oformat);
            } else if (encodingOutputBufferIndex == MediaCodec.INFO_TRY_AGAIN_LATER) {
                Log.d(LOG_TAG, "encoding dequeueOutputBuffer timed out!");
            }
        }
    }
    //clear some stuff here...

and these are the methods I use for decoding/encoding:

    private boolean decodeInput(MediaExtractor extractor, MediaCodec decoder, ByteBuffer[] codecInputBuffers) {
        boolean sawInputEOS = false;
        int inputBufIndex = decoder.dequeueInputBuffer(0);
        if (inputBufIndex >= 0) {
            ByteBuffer dstBuf = codecInputBuffers[inputBufIndex];
            input1count++;

            int sampleSize = extractor.readSampleData(dstBuf, 0);
            long presentationTimeUs = 0;
            if (sampleSize < 0) {
                sawInputEOS = true;
                sampleSize = 0;
                Log.d(LOG_TAG, "done decoding input: #" + input1count);
            } else {
                presentationTimeUs = extractor.getSampleTime();
            }

            decoder.queueInputBuffer(inputBufIndex, 0, sampleSize, presentationTimeUs, sawInputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0);
            if (!sawInputEOS) {
                extractor.advance();
            }
        }
        return sawInputEOS;
    }

    private boolean decodeOutputToFile(MediaExtractor extractor, MediaCodec decoder, ByteBuffer[] codecOutputBuffers,
            MediaCodec.BufferInfo info, int outputBufIndex, OutputStream output) throws IOException {
        boolean sawOutputEOS = false;

        ByteBuffer buf = codecOutputBuffers[outputBufIndex];
        if ((info.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
            sawOutputEOS = true;
            Log.d(LOG_TAG, "done decoding output: #" + output1count);
        }

        if (info.size > 0) {
            output1count++;
            byte[] outData = new byte[info.size];
            buf.get(outData);
            output.write(outData, 0, outData.length);
        } else {
            Log.d(LOG_TAG, "no data available " + info.size);
        }
        buf.clear();
        decoder.releaseOutputBuffer(outputBufIndex, false);
        return sawOutputEOS;
    }

    private boolean encodeInputFromFile(MediaCodec encoder, ByteBuffer[] encoderInputBuffers, MediaCodec.BufferInfo info, FileChannel channel) throws IOException {
        boolean sawInputEOS = false;
        int inputBufIndex = encoder.dequeueInputBuffer(0);
        if (inputBufIndex >= 0) {
            ByteBuffer dstBuf = encoderInputBuffers[inputBufIndex];
            input1count++;

            int sampleSize = channel.read(dstBuf);
            if (sampleSize < 0) {
                sawInputEOS = true;
                sampleSize = 0;
                Log.d(LOG_TAG, "done encoding input: #" + input1count);
            }

            encoder.queueInputBuffer(inputBufIndex, 0, sampleSize, channel.position(), sawInputEOS ? MediaCodec.BUFFER_FLAG_END_OF_STREAM : 0);
        }
        return sawInputEOS;
    }

Any suggestions on what I'm doing wrong?

I didn't find many examples of encoding with MediaCodec, just a few code samples for decoding... Thanks a lot for the help.

asked Apr 11 '13 by shem

People also ask

What is Android MediaCodec?

The MediaCodec class can be used to access low-level media codecs, i.e. encoder/decoder components. It is part of Android's low-level multimedia support infrastructure, normally used together with MediaExtractor, MediaSync, MediaMuxer, MediaCrypto, MediaDrm, Image, Surface, and AudioTrack.

How does MediaCodec work?

It processes data asynchronously and uses a set of input and output buffers. At a simplistic level, you request (or receive) an empty input buffer, fill it up with data and send it to the codec for processing. The codec uses up the data and transforms it into one of its empty output buffers.
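
In code, that exchange boils down to something like this (a minimal synchronous-mode sketch; `codec` is an already-configured MediaCodec, and `TIMEOUT_US`, `presentationTimeUs`, and the two helper methods are placeholders):

    // Minimal synchronous-mode skeleton of the MediaCodec buffer exchange
    // (EOS signalling, INFO_* return codes, and error handling omitted).
    ByteBuffer[] inputBuffers = codec.getInputBuffers();
    ByteBuffer[] outputBuffers = codec.getOutputBuffers();
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();

    int inIndex = codec.dequeueInputBuffer(TIMEOUT_US);    // request an empty input buffer
    if (inIndex >= 0) {
        ByteBuffer in = inputBuffers[inIndex];
        in.clear();
        int size = fillInputBuffer(in);                    // placeholder: copy source data into the buffer
        codec.queueInputBuffer(inIndex, 0, size, presentationTimeUs, 0);
    }

    int outIndex = codec.dequeueOutputBuffer(bufferInfo, TIMEOUT_US);
    if (outIndex >= 0) {
        ByteBuffer out = outputBuffers[outIndex];          // holds the processed (encoded/decoded) data
        consumeOutputBuffer(out, bufferInfo);              // placeholder: write or render the result
        codec.releaseOutputBuffer(outIndex, false);        // hand the buffer back to the codec
    }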

How are videos compressed?

Video compression tools work by using video encoding algorithms known as codecs. Codecs, by encoding video sequences, reduce the number of bits needed to represent a video. After encoding, the video sequences are wrapped in video containers (such as MP4 and AVI) and then output as compressed videos.


1 Answer

The output of MediaCodec is a raw elementary stream. You need to package it up into a video file format (possibly muxing the audio back in) before many players will recognize it. FWIW, I've found that the GStreamer-based Totem Movie Player for Linux will play "raw" video/avc files.

Update: The way to convert H.264 to .mp4 on Android is with the MediaMuxer class, introduced in Android 4.3 (API 18). There are a couple of examples (EncodeAndMuxTest, CameraToMpegTest) that demonstrate its use.
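
A rough outline of that MediaMuxer flow, for anyone on API 18+ (a minimal sketch rather than drop-in code: `encoder` is your running video encoder, `outputPath` is where the .mp4 should go, and audio muxing plus error handling are omitted):

    // Minimal sketch: wrap the encoder's raw H.264 output in an .mp4 with MediaMuxer (API 18+).
    // `encoder` and `outputPath` are assumed; the MediaMuxer constructor can throw IOException.
    MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);
    MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
    int videoTrack = -1;
    boolean muxerStarted = false;

    while (true) {
        int outIndex = encoder.dequeueOutputBuffer(bufferInfo, 10000);
        if (outIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
            // the encoder's output format (including csd-0/csd-1) defines the track
            videoTrack = muxer.addTrack(encoder.getOutputFormat());
            muxer.start();
            muxerStarted = true;
        } else if (outIndex >= 0) {
            if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) != 0) {
                bufferInfo.size = 0; // codec config data is already carried by the track format
            }
            if (bufferInfo.size > 0 && muxerStarted) {
                ByteBuffer encodedData = encoder.getOutputBuffers()[outIndex];
                encodedData.position(bufferInfo.offset);
                encodedData.limit(bufferInfo.offset + bufferInfo.size);
                muxer.writeSampleData(videoTrack, encodedData, bufferInfo);
            }
            encoder.releaseOutputBuffer(outIndex, false);
            if ((bufferInfo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
                break;
            }
        }
    }
    muxer.stop();
    muxer.release();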

answered by fadden