I'm using Android's MediaCodec class to read raw data from audio files. That works just fine.
The problem is that I don't know whether it's safe to assume that the output data will always be 16-bit.
I can tell experimentally that the output is 16-bit, but I don't know how to check that at runtime, and the MediaCodec documentation doesn't appear to say. The MediaFormat KEY_CHANNEL_MASK
could tell me, but MediaCodec doesn't appear to set that key. It sets the sample rate and the MIME type, but nothing that indicates the sample size explicitly.
I suppose that, given the difference between the presentation times of consecutive output buffers and the sample rate, I should be able to calculate it, but that doesn't seem very satisfactory.
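For what it's worth, that calculation would look roughly like the sketch below. It's a hypothetical illustration, not tested code: the helper name and parameters are my own, with sampleRate and channelCount assumed to come from the decoder's output MediaFormat, and the two BufferInfo structs assumed to belong to adjacent output buffers.

```java
import android.media.MediaCodec;

// Hypothetical estimate of the PCM sample size from the timing of two
// consecutive output buffers. prevInfo and currInfo are the BufferInfo
// structs filled in by dequeueOutputBuffer() for adjacent buffers.
static int estimateBitsPerSample(MediaCodec.BufferInfo prevInfo,
                                 MediaCodec.BufferInfo currInfo,
                                 int sampleRate,
                                 int channelCount) {
    // The presentation-time delta is the duration of the earlier buffer.
    long durationUs = currInfo.presentationTimeUs - prevInfo.presentationTimeUs;
    long frames = durationUs * sampleRate / 1_000_000L; // PCM frames in the earlier buffer
    int bytesPerFrame = (int) (prevInfo.size / frames);
    return (bytesPerFrame / channelCount) * 8;          // e.g. 16 for 16-bit samples
}
```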
Is there a way to tell, or is it written somewhere that I don't have to?
Currently the output is always 16-bit in stock Android. If that changes in the future we'll add an additional format key that specifies the format. Note that KEY_CHANNEL_MASK
would only tell you which channels are included (e.g. left, right, center, etc.), not the sample format.
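A format key along those lines did later appear as MediaFormat.KEY_PCM_ENCODING (API 24). A minimal sketch of a defensive check, reading that key when present and otherwise falling back to the 16-bit behaviour described above:

```java
import android.media.AudioFormat;
import android.media.MediaCodec;
import android.media.MediaFormat;

// Read the PCM encoding from the decoder's output format when available;
// otherwise assume 16-bit, per the guarantee above.
static int getPcmEncoding(MediaCodec codec) {
    MediaFormat format = codec.getOutputFormat();
    if (format.containsKey(MediaFormat.KEY_PCM_ENCODING)) {
        return format.getInteger(MediaFormat.KEY_PCM_ENCODING);
    }
    // Key absent (pre-API-24 devices): stock decoders emit 16-bit PCM.
    return AudioFormat.ENCODING_PCM_16BIT;
}
```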