 

Does MediaCodec always give 16-bit audio output?

I'm using Android's MediaCodec class to read raw data from audio files. That works just fine.

The problem is that I don't know whether it's safe to assume the output data will always be 16-bit.

I can tell, experimentally, that the output is 16-bit, but I don't know how to check that at runtime. The MediaCodec documentation doesn't appear to tell me. The MediaFormat KEY_CHANNEL_MASK could tell me, but MediaCodec doesn't appear to set those flags. It sets the sample rate, and the mime-type, but nothing that can tell me the bit-size explicitly.

I suppose that given the difference between presentation times of subsequent blocks, and the sample rate, I should be able to calculate it, but that doesn't seem very satisfactory.
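That back-of-the-envelope calculation would look something like the sketch below (the numbers are purely illustrative; in practice the sample rate and channel count would come from the output MediaFormat, and the buffer size and presentation-time delta from MediaCodec's BufferInfo):

```java
public class PcmBitDepth {
    // Estimate the bit depth from a buffer's size, its duration (the
    // difference in presentationTimeUs between consecutive buffers),
    // the sample rate, and the channel count.
    static int bitsPerSample(int bufferSizeBytes, long ptsDeltaUs,
                             int sampleRate, int channelCount) {
        // Frames covered by this buffer = duration * sample rate.
        long frames = Math.round(ptsDeltaUs * sampleRate / 1_000_000.0);
        int bytesPerSample = (int) (bufferSizeBytes / (frames * channelCount));
        return bytesPerSample * 8;
    }

    public static void main(String[] args) {
        // Hypothetical values: an 8192-byte stereo buffer at 44100 Hz whose
        // presentation time is 46440 microseconds ahead of the previous one.
        System.out.println(bitsPerSample(8192, 46440, 44100, 2)); // prints 16
    }
}
```

As the question notes, this is fragile: it relies on the decoder emitting contiguous, gap-free buffers with accurate timestamps, which is why an explicit format key would be preferable.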

Is there a way to tell, or is it written somewhere that I don't have to?

asked May 07 '14 by ams

1 Answer

Currently the output is always 16 bit in stock Android. If that changes in the future we'll add an additional format key that specifies the format. Note that KEY_CHANNEL_MASK would only tell you which channels are included (e.g. left, right, center, etc), not the sample format.
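That additional key did in fact arrive later: on API level 24 and up, MediaFormat defines KEY_PCM_ENCODING, whose value is one of the AudioFormat encoding constants. A hedged sketch of the runtime check (the key is simply absent on older releases, where the output is 16-bit as this answer states):

```java
import android.media.AudioFormat;
import android.media.MediaCodec;
import android.media.MediaFormat;

class PcmEncodingCheck {
    // Inspect the decoder's output format, e.g. after dequeueOutputBuffer
    // returns INFO_OUTPUT_FORMAT_CHANGED. KEY_PCM_ENCODING was added in
    // API 24; when the key is missing, 16-bit PCM is the safe assumption.
    static int outputBitsPerSample(MediaCodec codec) {
        MediaFormat format = codec.getOutputFormat();
        int encoding = AudioFormat.ENCODING_PCM_16BIT; // default if absent
        if (format.containsKey(MediaFormat.KEY_PCM_ENCODING)) {
            encoding = format.getInteger(MediaFormat.KEY_PCM_ENCODING);
        }
        switch (encoding) {
            case AudioFormat.ENCODING_PCM_8BIT:  return 8;
            case AudioFormat.ENCODING_PCM_FLOAT: return 32;
            case AudioFormat.ENCODING_PCM_16BIT:
            default:                             return 16;
        }
    }
}
```

This requires an Android device or emulator to run; it is not verifiable off-device.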

answered Sep 19 '22 by marcone