
Android - How to add my own Audio codec to AudioRecord?

I currently have a loopback program for testing audio on Android devices.

It uses AudioRecord and AudioTrack to record PCM audio from the mic and play PCM audio out of the earpiece.

Here is the code:

public class Record extends Thread {

        // Fields the original snippet referenced but never declared.
        private volatile boolean isRecording;
        private AudioRecord arec;
        private AudioTrack atrack;

        @Override
        public void run() {
                isRecording = true;
                android.os.Process.setThreadPriority(
                                android.os.Process.THREAD_PRIORITY_URGENT_AUDIO);

                // Smallest workable buffer for 11.025 kHz mono 16-bit PCM.
                int buffersize = AudioRecord.getMinBufferSize(11025,
                                AudioFormat.CHANNEL_IN_MONO,
                                AudioFormat.ENCODING_PCM_16BIT);

                arec = new AudioRecord(MediaRecorder.AudioSource.MIC,
                                11025,
                                AudioFormat.CHANNEL_IN_MONO,
                                AudioFormat.ENCODING_PCM_16BIT,
                                buffersize);

                atrack = new AudioTrack(AudioManager.STREAM_VOICE_CALL,
                                11025,
                                AudioFormat.CHANNEL_OUT_MONO,
                                AudioFormat.ENCODING_PCM_16BIT,
                                buffersize,
                                AudioTrack.MODE_STREAM);

                atrack.setPlaybackRate(11025);

                byte[] buffer = new byte[buffersize];
                arec.startRecording();
                atrack.play();

                // Loopback: whatever the mic captures goes straight to the earpiece.
                while (isRecording) {
                        int read = arec.read(buffer, 0, buffersize);
                        if (read > 0) {
                                atrack.write(buffer, 0, read);
                        }
                }

                arec.stop();
                arec.release();
                atrack.stop();
                atrack.release();
        }
}

As you can see, when the AudioTrack and AudioRecord are created the encoding is supplied via AudioFormat, but that only allows 16-bit or 8-bit PCM.

I now have my own G711 codec implementation, and I want to encode the audio coming from the mic and decode it on its way to the earpiece. I have encode(short lin[], int offset, byte enc[], int frames) and decode(byte enc[], short lin[], int frames) methods, but I'm unsure how to use them with the audio from AudioRecord and AudioTrack.

Can anyone help me or point me in the right direction?

asked Nov 05 '22 by Donal Rafferty

1 Answer

Change your arec.read(buffer, 0, buffersize) call to use the ByteBuffer read() overload of AudioRecord.

Once you have your bytes in the ByteBuffer, you can call your G711 encode implementation, using the ByteBuffer.asShortBuffer() method to get the captured PCM samples into the encoder.
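
A rough sketch of that read path (untested; the g711 object and its encode(short lin[], int offset, byte enc[], int frames) signature come from the question, and java.nio.ByteBuffer/ByteOrder imports are assumed):

        // Sketch only: feed AudioRecord's ByteBuffer read() overload
        // into the asker's G711 encoder. Sizes mirror the question's code.
        ByteBuffer pcmBytes = ByteBuffer.allocateDirect(buffersize)
                        .order(ByteOrder.nativeOrder()); // must be a direct buffer
        short[] pcm = new short[buffersize / 2];
        byte[] encoded = new byte[buffersize / 2]; // G.711: one byte per sample

        while (isRecording) {
                int bytesRead = arec.read(pcmBytes, buffersize); // ByteBuffer overload
                if (bytesRead <= 0) continue;

                int frames = bytesRead / 2; // 16-bit PCM: two bytes per sample
                pcmBytes.asShortBuffer().get(pcm, 0, frames);
                g711.encode(pcm, 0, encoded, frames);

                pcmBytes.clear();
                // 'encoded' now holds the G.711 data for this block
        }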

That solves the original question without introducing a third-party library to do the work for you. (This answer is for future readers who come across the question.)

My question is why?

In the code above you capture PCM data from the microphone and write it directly to the buffer for playback.

In your implementation the path PCM -> G711 (encode) -> G711 (decode) -> PCM makes no sense; all it adds is unnecessary processing and latency. If you were writing the encoded data to a file instead of playing it through the earpiece, that would be a different story, but as the code stands there is no point in encoding the PCM data.
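
(For completeness, if you did want that round trip purely to exercise the codec, the loop would look roughly like this; encode/decode are the question's methods on a hypothetical codec object, and this is an untested sketch:)

        // Untested round-trip sketch using the question's method signatures.
        short[] pcmIn  = new short[buffersize / 2];
        byte[]  g711   = new byte[buffersize / 2];
        short[] pcmOut = new short[buffersize / 2];

        while (isRecording) {
                int samples = arec.read(pcmIn, 0, pcmIn.length); // short[] overload
                if (samples <= 0) continue;

                codec.encode(pcmIn, 0, g711, samples); // PCM -> G.711
                codec.decode(g711, pcmOut, samples);   // G.711 -> PCM
                atrack.write(pcmOut, 0, samples);      // play the decoded audio
        }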

Introducing your own codec here would only make sense if you were writing the compressed voice data to a file (for example, recording call audio in compressed form) or sending it over the network.
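
For example, a sketch of persisting the encoded frames instead of playing them back (the file path is a placeholder, java.io.FileOutputStream is assumed, and this is not a tested implementation):

        // Illustrative only: append each encoded block to a raw G.711 file.
        FileOutputStream out = new FileOutputStream("/sdcard/call.g711");
        try {
                while (isRecording) {
                        int samples = arec.read(pcmIn, 0, pcmIn.length);
                        if (samples <= 0) continue;
                        codec.encode(pcmIn, 0, g711, samples);
                        out.write(g711, 0, samples); // one byte per PCM sample
                }
        } finally {
                out.close();
        }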

answered Nov 14 '22 by CraneStyle