How to play a webrtc.AudioTrack on android (No Video)

I'm trying to use the native WebRTC SDK (libjingle) for Android. So far I can send streams from Android to the web (or other platforms) just fine, and I can also receive the MediaStream from a peer (in the onAddStream callback).

The project I'm working on requires only audio streams. No video tracks are being created or sent to anyone.

My question is: how do I play the MediaStream object that I get from remote peers?

@Override
public void onAddStream(MediaStream mediaStream) {
    Log.d(TAG, "onAddStream: got remote stream");
    // Need to play the audio ///
}

Again, the question is about audio; I'm not using video. Apparently all the native WebRTC examples use video tracks, so I had no luck finding any documentation or examples on the web.

Thanks in advance!

Sagi Dayan asked Jul 25 '16

1 Answer

We can get the remote AudioTrack using the code below:

import org.webrtc.AudioTrack;
import org.webrtc.MediaStream;

private AudioTrack remoteAudioTrack;

@Override
public void onAddStream(final MediaStream stream) {
    if (stream.audioTracks.size() > 0) {
        // Keep a reference to the remote audio track; playback starts automatically.
        remoteAudioTrack = stream.audioTracks.get(0);
    }
}
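
If you need to mute/unmute or change the volume of the remote audio, the received track can be controlled directly. A minimal sketch, assuming remoteAudioTrack is the field saved in onAddStream above (setEnabled() and setVolume() are methods of org.webrtc.AudioTrack; the helper names here are just examples):

// Mute or unmute the remote audio (the track keeps flowing; only playback is toggled)
private void muteRemoteAudio(boolean mute) {
    if (remoteAudioTrack != null) {
        remoteAudioTrack.setEnabled(!mute);
    }
}

// Scale the playback volume of the remote audio (1.0 = unchanged, 0.0 = silent)
private void setRemoteAudioVolume(double volume) {
    if (remoteAudioTrack != null) {
        remoteAudioTrack.setVolume(volume);
    }
}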

"Apparently all the native WebRTC examples use video tracks, so I had no luck finding any documentation or examples on the web."

Yes, as app developers we only have to take care of video rendering. Once the remote AudioTrack is received, it plays automatically through the default output device (earpiece, loudspeaker, or wired headset), depending on the audio routing and proximity settings.

Check the code below from AppRTCAudioManager.java to enable/disable the speakerphone:

/** Sets the speaker phone mode. */
private void setSpeakerphoneOn(boolean on) {
    boolean wasOn = audioManager.isSpeakerphoneOn();
    if (wasOn == on) {
        return;
    }
    audioManager.setSpeakerphoneOn(on);
}

Reference Source: AppRTCAudioManager.java
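
In your own app, a similar helper needs a configured android.media.AudioManager. A minimal sketch, assuming it is called from an Activity or Service during a call (these are standard Android framework APIs, not part of the WebRTC SDK, and the helper name is just an example):

import android.content.Context;
import android.media.AudioManager;

// Route the call audio to the loudspeaker, or back to the earpiece.
// MODE_IN_COMMUNICATION tells Android this is VoIP-style audio.
private void routeToSpeaker(Context context, boolean useSpeaker) {
    AudioManager audioManager =
            (AudioManager) context.getSystemService(Context.AUDIO_SERVICE);
    audioManager.setMode(AudioManager.MODE_IN_COMMUNICATION);
    audioManager.setSpeakerphoneOn(useSpeaker);
}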

Ajay answered Nov 07 '22