I am trying to encode AAC audio using Android's AudioRecord and MediaCodec. I have created an encoder class very similar to the one in (Encoding H.264 from camera with Android MediaCodec). With this class, I create an instance of AudioRecord and have it read its byte[] data off to the AudioEncoder (audioEncoder.offerEncoder(Data)).
while (isRecording)
{
    audioRecord.read(Data, 0, Data.length);
    audioEncoder.offerEncoder(Data);
}
Here are the settings for my AudioRecord:
int audioSource = MediaRecorder.AudioSource.MIC;
int sampleRateInHz = 44100;
int channelConfig = AudioFormat.CHANNEL_IN_MONO;
int audioFormat = AudioFormat.ENCODING_PCM_16BIT;
int bufferSizeInBytes = AudioRecord.getMinBufferSize(sampleRateInHz, channelConfig, audioFormat);
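The recorder is then constructed from these values, roughly like this (this continuation is only a sketch of the usual pattern, not the exact code I run; the doubled buffer size is just a common safety margin):
AudioRecord audioRecord = new AudioRecord(audioSource, sampleRateInHz,
        channelConfig, audioFormat, bufferSizeInBytes * 2);  // extra headroom over the minimum
byte[] Data = new byte[bufferSizeInBytes];
audioRecord.startRecording();
Note that audioRecord.read() returns the number of bytes actually read, so strictly speaking only that many bytes should be handed to offerEncoder() rather than the whole array.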
I successfully collected some byte[] data and wrote it to a local file. Unfortunately, the file is not playable. I did some more searching online and found a related post (How to generate the AAC ADTS elementary stream with Android MediaCodec). People who ran into a similar problem say the main issue is that "The MediaCodec encoder generates the raw AAC stream. The raw AAC stream needs to be converted into a playable format, such as the ADTS stream". So I tried to add the ADTS header. However, after I added the ADTS header (commented out in the code below), my AudioEncoder wouldn't even write the output audio file. Is there anything I'm missing? Is my setup correct?
Any suggestions, comments, and opinions are welcome and very much appreciated. Thanks!
import android.media.MediaCodec;
import android.media.MediaCodecInfo;
import android.media.MediaFormat;
import android.os.Environment;
import android.util.Log;
import java.io.BufferedOutputStream;
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.nio.ByteBuffer;
public class AudioEncoder {
    private MediaCodec mediaCodec;
    private BufferedOutputStream outputStream;
    private String mediaType = "audio/mp4a-latm";

    public AudioEncoder() {
        File f = new File(Environment.getExternalStorageDirectory(), "Download/audio_encoded.aac");
        touch(f);
        try {
            outputStream = new BufferedOutputStream(new FileOutputStream(f));
            Log.e("AudioEncoder", "outputStream initialized");
        } catch (Exception e) {
            e.printStackTrace();
        }

        mediaCodec = MediaCodec.createEncoderByType(mediaType);
        final int kSampleRates[] = { 8000, 11025, 22050, 44100, 48000 };
        final int kBitRates[] = { 64000, 128000 };
        MediaFormat mediaFormat = MediaFormat.createAudioFormat(mediaType, kSampleRates[3], 1);
        mediaFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectLC);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, kBitRates[1]);
        mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);
        mediaCodec.start();
    }
    public void close() {
        try {
            mediaCodec.stop();
            mediaCodec.release();
            outputStream.flush();
            outputStream.close();
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
    // called with the data from AudioRecord's read
    public synchronized void offerEncoder(byte[] input) {
        Log.e("AudioEncoder", input.length + " is coming");
        try {
            ByteBuffer[] inputBuffers = mediaCodec.getInputBuffers();
            ByteBuffer[] outputBuffers = mediaCodec.getOutputBuffers();
            int inputBufferIndex = mediaCodec.dequeueInputBuffer(-1);
            if (inputBufferIndex >= 0) {
                ByteBuffer inputBuffer = inputBuffers[inputBufferIndex];
                inputBuffer.clear();
                inputBuffer.put(input);
                mediaCodec.queueInputBuffer(inputBufferIndex, 0, input.length, 0, 0);
            }

            MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();
            int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);

            //// trying to add an ADTS header
            // while (outputBufferIndex >= 0) {
            //     int outBitsSize = bufferInfo.size;
            //     int outPacketSize = outBitsSize + 7; // 7 is ADTS size
            //     ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
            //
            //     outputBuffer.position(bufferInfo.offset);
            //     outputBuffer.limit(bufferInfo.offset + outBitsSize);
            //
            //     byte[] outData = new byte[outPacketSize];
            //     addADTStoPacket(outData, outPacketSize);
            //
            //     outputBuffer.get(outData, 7, outBitsSize);
            //     outputBuffer.position(bufferInfo.offset);
            //
            ////     byte[] outData = new byte[bufferInfo.size];
            //     outputStream.write(outData, 0, outData.length);
            //     Log.e("AudioEncoder", outData.length + " bytes written");
            //
            //     mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
            //     outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
            // }

            // Without ADTS header
            while (outputBufferIndex >= 0) {
                ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
                byte[] outData = new byte[bufferInfo.size];
                outputBuffer.get(outData);
                outputStream.write(outData, 0, outData.length);
                Log.e("AudioEncoder", outData.length + " bytes written");
                mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
                outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
            }
        } catch (Throwable t) {
            t.printStackTrace();
        }
    }
    /**
     * Add ADTS header at the beginning of each and every AAC packet.
     * This is needed as MediaCodec encoder generates a packet of raw
     * AAC data.
     *
     * Note the packetLen must count in the ADTS header itself.
     **/
    private void addADTStoPacket(byte[] packet, int packetLen) {
        int profile = 2;  // AAC LC
                          // 39 = MediaCodecInfo.CodecProfileLevel.AACObjectELD;
        int freqIdx = 4;  // 44.1KHz
        int chanCfg = 2;  // CPE

        // fill in ADTS data
        packet[0] = (byte) 0xFF;
        packet[1] = (byte) 0xF9;
        packet[2] = (byte) (((profile - 1) << 6) + (freqIdx << 2) + (chanCfg >> 2));
        packet[3] = (byte) (((chanCfg & 3) << 6) + (packetLen >> 11));
        packet[4] = (byte) ((packetLen & 0x7FF) >> 3);
        packet[5] = (byte) (((packetLen & 7) << 5) + 0x1F);
        packet[6] = (byte) 0xFC;
    }
    public void touch(File f) {
        try {
            if (!f.exists())
                f.createNewFile();
        } catch (IOException e) {
            e.printStackTrace();
        }
    }
}
You can use Android's MediaMuxer to package the raw streams created by MediaCodec into a .mp4 file. Bonus: AAC packets contained in a .mp4 don't require the ADTS header.
I've got a working example of this technique on Github.
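As a rough sketch of that muxer path (API 18+; outputPath is a placeholder, and keep in mind that MediaMuxer needs real, increasing presentationTimeUs values in the BufferInfo, whereas the code above queues every input buffer with a timestamp of 0):
MediaMuxer muxer = new MediaMuxer(outputPath, MediaMuxer.OutputFormat.MUXER_OUTPUT_MPEG_4);  // constructor throws IOException
int trackIndex = -1;
boolean muxerStarted = false;

// inside the output drain loop:
int outputBufferIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
if (outputBufferIndex == MediaCodec.INFO_OUTPUT_FORMAT_CHANGED) {
    // the encoder reports its final output format (with the codec-specific data) exactly once
    trackIndex = muxer.addTrack(mediaCodec.getOutputFormat());
    muxer.start();
    muxerStarted = true;
} else if (outputBufferIndex >= 0) {
    ByteBuffer outputBuffer = outputBuffers[outputBufferIndex];
    if (muxerStarted && (bufferInfo.flags & MediaCodec.BUFFER_FLAG_CODEC_CONFIG) == 0) {
        // no ADTS header needed; the muxer writes the sample straight into the .mp4
        muxer.writeSampleData(trackIndex, outputBuffer, bufferInfo);
    }
    mediaCodec.releaseOutputBuffer(outputBufferIndex, false);
}

// when recording ends:
muxer.stop();
muxer.release();
The key difference from the file-writing code in the question is that the muxer consumes the encoder's output format and BufferInfo directly, so no hand-built headers are involved.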
Check "testEncoder" method here for how to use MediaCodec as Encoder properly.
after that In your code,
your input(audio recorder) is configured for single audio channel while your output(ADTS packet header) is set for two channels(chanCfg = 2).
also if you change your input samplerate (currently 44.1khz) you also have to change freqIdx flag in ADTS packet header. check this link for valid values.
And ADTS header profile flag is set to "AAC LC", you can also found this under MediaCodecInfo.CodecProfileLevel. you have set profile = 2 that is MediaCodecInfo.CodecProfileLevel.AACObjectLC
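For example, a header matching the question's setup (44.1 kHz, mono, AAC LC) would look something like this; it is the same addADTStoPacket method with only the channel configuration changed:
private void addADTStoPacket(byte[] packet, int packetLen) {
    int profile = 2;  // AAC LC (MediaCodecInfo.CodecProfileLevel.AACObjectLC)
    int freqIdx = 4;  // 44.1 kHz
    int chanCfg = 1;  // 1 channel (mono), matching CHANNEL_IN_MONO in the recorder

    packet[0] = (byte) 0xFF;
    packet[1] = (byte) 0xF9;
    packet[2] = (byte) (((profile - 1) << 6) + (freqIdx << 2) + (chanCfg >> 2));
    packet[3] = (byte) (((chanCfg & 3) << 6) + (packetLen >> 11));
    packet[4] = (byte) ((packetLen & 0x7FF) >> 3);
    packet[5] = (byte) (((packetLen & 7) << 5) + 0x1F);
    packet[6] = (byte) 0xFC;
}
One more thing worth checking (just a guess at why the ADTS path produced nothing usable, not something confirmed in the question): the very first output buffer from the encoder is flagged BUFFER_FLAG_CODEC_CONFIG and contains the 2-byte codec config rather than audio data, so it should be skipped instead of being wrapped in an ADTS header.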