
Android MediaCodec: How many simultaneous (video) decoding threads are supported on multiple SurfaceViews?


Starting from the Grafika project's DoubleDecodeActivity.java, I tried running 3 simultaneous video (H.264) decoders with the MediaCodec API, one per SurfaceView, on a Nexus 7 with Android 5.1. Adding a 4th decoder on a 4th SurfaceView crashes the app. So how many simultaneous decoders are possible or supported?

P.S. After this crash, MediaCodec doesn't work at all anymore; the device has to be restarted before MediaCodec can be used again.
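
For context, each decoder thread does roughly the following (a simplified sketch, not the exact PlayerFromFileThread code from MainActivity; the buffer-feeding loop is omitted, and videoPath/surface are placeholders):

import android.media.MediaCodec;
import android.media.MediaExtractor;
import android.media.MediaFormat;
import android.view.Surface;

// Simplified per-SurfaceView decoder thread; the input/output buffer loop is omitted.
class PlayerThreadSketch extends Thread {
    private final String videoPath;   // placeholder: path to the H.264 file
    private final Surface surface;    // placeholder: Surface from one SurfaceView

    PlayerThreadSketch(String videoPath, Surface surface) {
        this.videoPath = videoPath;
        this.surface = surface;
    }

    @Override
    public void run() {
        MediaExtractor extractor = new MediaExtractor();
        MediaCodec decoder = null;
        try {
            extractor.setDataSource(videoPath);
            // Pick the first video track and create a matching decoder.
            for (int i = 0; i < extractor.getTrackCount(); i++) {
                MediaFormat format = extractor.getTrackFormat(i);
                String mime = format.getString(MediaFormat.KEY_MIME);
                if (mime.startsWith("video/")) {
                    extractor.selectTrack(i);
                    decoder = MediaCodec.createDecoderByType(mime);
                    decoder.configure(format, surface, null, 0);
                    break;
                }
            }
            if (decoder == null) return;
            decoder.start();   // the 4th thread throws CodecException here
            // ... dequeue/queue input buffers from the extractor, release output buffers to the Surface ...
        } catch (Exception e) {
            e.printStackTrace();
        } finally {
            if (decoder != null) decoder.release();
            extractor.release();
        }
    }
}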

Below is the crash log. The crash happens at the decoder.start() call in the 4th decoder thread.

com.example.app.one V/DecodeActivity: Mime: video/avc
com.example.app.one I/OMXClient: Using client-side OMX mux.
com.example.app.one V/DecodeActivity: Mime: video/avc
com.example.app.one I/OMXClient: Using client-side OMX mux.
com.example.app.one V/DecodeActivity: Mime: video/avc
com.example.app.one E/ACodec: [OMX.qcom.video.decoder.avc] storeMetaDataInBuffers failed w/ err -2147483648
com.example.app.one E/ACodec: [OMX.qcom.video.decoder.avc] storeMetaDataInBuffers failed w/ err -2147483648
com.example.app.one W/ACodec: do not know color format 0x7fa30c03 = 2141391875
com.example.app.one W/ACodec: do not know color format 0x7fa30c03 = 2141391875
com.example.app.one I/OMXClient: Using client-side OMX mux.
com.example.app.one V/DecodeActivity: Mime: video/avc
com.example.app.one I/OMXClient: Using client-side OMX mux.
com.example.app.one E/ACodec: [OMX.qcom.video.decoder.avc] storeMetaDataInBuffers failed w/ err -2147483648
com.example.app.one E/ACodec: [OMX.qcom.video.decoder.avc] storeMetaDataInBuffers failed w/ err -2147483648
com.example.app.one W/ACodec: do not know color format 0x7fa30c03 = 2141391875
com.example.app.one W/ACodec: do not know color format 0x7fa30c03 = 2141391875
com.example.app.one E/ACodec: registering GraphicBuffer 9 with OMX IL component failed: -2147483648
com.example.app.one V/PlayerFromFileThread: inputBuffer not available.
com.example.app.one E/ACodec: Failed to allocate buffers after transitioning to IDLE state (error 0x80000000)
com.example.app.one E/ACodec: signalError(omxError 0x80001001, internalError -2147483648)
com.example.app.one V/PlayerFromFileThread: inputBuffer not available.
com.example.app.one E/MediaCodec: Codec reported err 0x80001001, actionCode 0, while in state 5
? E/ACodec: registering GraphicBuffer 4 with OMX IL component failed: -2147483648
? E/AndroidRuntime: FATAL EXCEPTION: Thread-485
                                                   Process: com.example.app.one, PID: 17143
                                                   android.media.MediaCodec$CodecException: start failed
                                                       at android.media.MediaCodec.native_start(Native Method)
                                                       at android.media.MediaCodec.start(MediaCodec.java:612)
                                                       at com.example.app.one.MainActivity$PlayerFromFileThread.run(MainActivity.java:1921)
? E/ACodec: Failed to allocate buffers after transitioning to IDLE state (error 0x80000000)
? E/ACodec: signalError(omxError 0x80001001, internalError -2147483648)
? E/MediaCodec: Codec reported err 0x80001001, actionCode 0, while in state 5
asked Apr 01 '16 by Arpan


1 Answer

This is somewhat poorly defined.

In API 23 the MediaCodecInfo.CodecCapabilities getMaxSupportedInstances() method was added, which boldly claims, "This is a hint for an upper bound."
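
On API 23+ you can read that hint like this (a minimal sketch that just logs the value for each AVC decoder it finds):

import android.media.MediaCodecInfo;
import android.media.MediaCodecList;
import android.util.Log;

static void logAvcDecoderInstanceHints() {
    // Walk every installed codec and log the API 23+ instance-count hint
    // for AVC decoders. The value is only a hint, not a guarantee.
    MediaCodecList list = new MediaCodecList(MediaCodecList.REGULAR_CODECS);
    for (MediaCodecInfo info : list.getCodecInfos()) {
        if (info.isEncoder()) continue;
        for (String type : info.getSupportedTypes()) {
            if (type.equalsIgnoreCase("video/avc")) {
                int hint = info.getCapabilitiesForType(type).getMaxSupportedInstances();
                Log.d("CodecQuery", info.getName() + " maxSupportedInstances hint = " + hint);
            }
        }
    }
}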

The trouble with defining this value firmly is that the number of hardware instances may be limited by bandwidth requirements, rather than a fixed value. So you might be able to decode two 720p streams but only one 1080p stream.

On many devices, if the hardware is unable to support your request, OMX will switch to a software decoder, e.g. one of the older Nexus devices would let you decode two streams with the hardware codec and then start handing out software codec instances.

The getMaxSupportedInstances() call was an attempt to provide additional information, but as far as I can tell there's still some amount of per-device trial and error required to determine what exactly a device can do.
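
One way to do that trial and error is to probe at startup: create, configure, and start decoders one at a time until one fails, then release them all. A rough sketch (my own illustration, not platform API or Grafika code; the method name, the no-Surface configure, and the width/height parameters are all choices made for the probe):

import android.media.MediaCodec;
import android.media.MediaFormat;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;

static int probeSimultaneousDecoders(String mime, int width, int height, int limit) {
    // Trial-and-error probe: bring up decoders one at a time until one fails,
    // then release everything. The count only reflects conditions at probe time.
    List<MediaCodec> started = new ArrayList<>();
    try {
        for (int i = 0; i < limit; i++) {
            MediaCodec codec;
            try {
                codec = MediaCodec.createDecoderByType(mime);
            } catch (IOException e) {
                break;  // could not create another instance
            }
            try {
                MediaFormat format = MediaFormat.createVideoFormat(mime, width, height);
                codec.configure(format, null, null, 0);  // no output Surface for the probe
                codec.start();
                started.add(codec);
            } catch (Exception e) {
                codec.release();  // configure()/start() failed: we've hit the limit
                break;
            }
        }
        return started.size();
    } finally {
        for (MediaCodec codec : started) {
            try {
                codec.stop();
            } catch (Exception ignored) {
            }
            codec.release();
        }
    }
}

Since the limit can depend on bandwidth (per the 720p vs. 1080p point above), probing at the resolutions you actually intend to play gives the most meaningful answer.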

answered Oct 11 '22 by fadden