 

How to stream audio in a PhoneGap app

I'm trying to implement audio calls in the PhoneGap app that I'm creating. I'm using PeerJS to get the two clients to connect, and that part is working. The problem is with the URL created for the media stream. It looks like this:

<audio src="blob:file%3A///6212fd77-7d1e-46ba-9abe-cfe96d414b28"></audio>

Instead of something like:

<audio src="blob:http%3A///url-to-server.com/6212fd77-7d1e-46ba-9abe-cfe96d414b28"></audio>

That is why nothing is heard on either device (I'm testing with Genymotion and my smartphone).

The odd thing is that it partially works when I test between the browser and my phone: when I speak into the built-in mic on my laptop I hear it on my phone, but when I speak into my phone nothing is heard on my laptop.

For reference this is what I'm getting when I select the audio element which plays the media stream:

<audio src="blob:http%3A//localhost%3A8100/861180b2-2c12-4134-a6da-89fcb40ef372"></audio>

Not really sure if this is actually the problem though.

I'm using Ionic for developing the app, and it integrates smoothly with Crosswalk, which basically packages a recent Chrome WebView with your app so that it can use shiny new things without problems.

Here's the code for requesting the mic:

function getAudio(successCallback, errorCallback){
    navigator.getUserMedia({
        audio: true,
        video: false
    }, successCallback, errorCallback);
}
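One thing worth checking before anything else: in older WebViews `getUserMedia` is often only exposed under a vendor prefix, so a bare `navigator.getUserMedia` can be undefined even when the capability exists. A minimal sketch of a prefix-resolving helper (the `nav` parameter stands in for the global `navigator` object, so the lookup logic can be tested in isolation):

```javascript
// Resolve getUserMedia across vendor prefixes. Returns the function
// bound to the navigator-like object, or null if no variant exists.
function resolveGetUserMedia(nav) {
    var fn = nav.getUserMedia ||
             nav.webkitGetUserMedia ||
             nav.mozGetUserMedia ||
             nav.msGetUserMedia;
    return fn ? fn.bind(nav) : null;
}
```

With this, `getAudio` could bail out with a clear error when no variant is found instead of throwing on an undefined function.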

And then I call it whenever someone initiates a call:

getAudio(
    function(stream){
        console.log('now calling ' + to);
        var call = peer.call(to, stream);
        call.on('stream', onReceiveStream);
    },
    function(err){
        // surface getUserMedia failures instead of dropping them silently
        console.log('getUserMedia error: ' + err.name);
    });

Then onReceiveStream converts the media stream into a URL which is then assigned to the audio element:

function onReceiveStream(stream){
    var audio = document.querySelector('audio');
    audio.src = window.URL.createObjectURL(stream);
    audio.onloadedmetadata = function(e){
        console.log('now playing the audio');
        audio.play();
    }
}
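As a side note on that `createObjectURL` call: newer WebViews let you assign the stream directly via the `srcObject` property, which avoids the `blob:` URL entirely. A sketch of a hypothetical helper that prefers `srcObject` and falls back to the legacy path (the `urlApi` parameter stands in for `window.URL` so the fallback stays explicit and testable):

```javascript
// Attach an incoming MediaStream to an <audio> element, preferring the
// modern srcObject property over the legacy createObjectURL blob: URL.
function attachStream(audioEl, stream, urlApi) {
    if ('srcObject' in audioEl) {
        audioEl.srcObject = stream;                    // modern path
        return 'srcObject';
    }
    audioEl.src = urlApi.createObjectURL(stream);      // legacy path
    return 'objectURL';
}
```

In `onReceiveStream` above, this would replace the direct `window.URL.createObjectURL` assignment, e.g. `attachStream(audio, stream, window.URL)`.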

Any ideas? Thanks in advance.

Update

It seems the real problem is that audio can't be captured from the phone because it's not requesting access to the mic when navigator.getUserMedia is called. I've confirmed that navigator.getUserMedia is accessible when I tested with Genymotion, though it's not requesting access in Genymotion either. I'm using RecordRTC as a shim provider.

Update 2

OK, I gave up on this. It's not really possible to ask for the microphone. What I'm trying right now is PhoneRTC, but I'm stuck on a problem again: it's not building when I have the plugin installed. Please check out this issue on the project's GitHub page for more details: https://github.com/alongubkin/phonertc/issues/151

Wern Ancheta asked May 05 '15


1 Answer

To access phone resources you have to declare it in AndroidManifest.xml, so the end user knows about it when installing the app.

Cordova and/or PhoneGap builds a native application even if you are only using it as a WebView wrapper, so you still have to ask for permissions in the manifest.

For the microphone and camera the permissions are:

<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.RECORD_AUDIO" />

but it seems that some plugins also do some tricks and need one more:

<uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
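If you would rather not hand-edit the generated manifest (it can be overwritten on rebuild), newer cordova-android versions also let you declare these from config.xml. A sketch, assuming a cordova-android version that supports `<config-file>` blocks in config.xml (the `android` XML namespace must also be declared on the `<widget>` root element):

```xml
<!-- config.xml sketch: merge the permissions into AndroidManifest.xml -->
<platform name="android">
    <config-file target="AndroidManifest.xml" parent="/manifest">
        <uses-permission android:name="android.permission.RECORD_AUDIO" />
        <uses-permission android:name="android.permission.MODIFY_AUDIO_SETTINGS" />
    </config-file>
</platform>
```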
gaugeinvariante answered Nov 17 '22