
Stream media file using WebRTC

Here is my use case: Alice has a cool new media track that she wants Bob to listen to. She selects the media file in her browser and the media file starts playing instantly in Bob's browser.

I'm not even sure if this is possible to build using the WebRTC API right now. All the examples I can find use streams obtained via getUserMedia(), but this is what I have:

var context = new AudioContext();
var pc = new RTCPeerConnection(pc_config);

function handleFileSelect(event) {
    var file = event.target.files[0];

    if (file && file.type.match('audio.*')) {
        console.log(file.name);
        var reader = new FileReader();

        reader.onload = function(readEvent) {
            // Decode the file into a raw AudioBuffer.
            context.decodeAudioData(readEvent.target.result, function(buffer) {
                var source = context.createBufferSource();
                // A MediaStreamDestination exposes the audio graph's output
                // as a MediaStream that can be handed to the PeerConnection.
                var destination = context.createMediaStreamDestination();
                source.buffer = buffer;
                source.connect(destination);
                source.start(0);
                pc.addStream(destination.stream);
                pc.createOffer(setLocalAndSendMessage);
            });
        };

        reader.readAsArrayBuffer(file);
    }
}

On the receiving side I have the following:

function gotRemoteStream(event) {
    // Feed the remote WebRTC stream into the local Web Audio graph
    // and route it to the speakers.
    var mediaStreamSource = context.createMediaStreamSource(event.stream);
    mediaStreamSource.connect(context.destination);
}

This code does not make the media (music) play on the receiving side. I do, however, receive an ended event right after the WebRTC handshake completes and gotRemoteStream has been called. So gotRemoteStream fires, but the media never starts playing.

On Alice's side the magic is supposed to happen in the line that says source.connect(destination). When I replace that line with source.connect(context.destination), the media plays correctly through Alice's speakers.

On Bob's side a media stream source is created from Alice's stream. However, when the local speakers are connected via mediaStreamSource.connect(context.destination), the music doesn't start playing through them.

Of course I could always send the media file through a DataChannel, but where is the fun in that...
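
(For reference, a rough sketch of that fallback: chunking the file bytes over an RTCDataChannel. The 'file-transfer' label and the 16 KB chunk size are arbitrary illustrative choices, and file is assumed to be the File object from handleFileSelect above.)

    // Hypothetical fallback: send the raw file bytes over a DataChannel.
    // The receiving side would reassemble the chunks and decode them itself.
    var channel = pc.createDataChannel('file-transfer');

    channel.onopen = function() {
        var chunkSize = 16384; // arbitrary 16 KB chunks
        var offset = 0;
        var reader = new FileReader();

        reader.onload = function(readEvent) {
            channel.send(readEvent.target.result);
            offset += chunkSize;
            if (offset < file.size) {
                readSlice();
            }
        };

        function readSlice() {
            reader.readAsArrayBuffer(file.slice(offset, offset + chunkSize));
        }

        readSlice();
    };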

Any clues on what is wrong with my code or some ideas on how to achieve my use case would be greatly appreciated!

I'm using the latest and greatest Chrome Canary.

Thanks.

asked Jul 04 '13 by Eelco


2 Answers

It is possible to play the audio using an Audio element like this:

function gotRemoteStream(event) {
    var player = new Audio();
    // attachMediaStream comes from the adapter.js shim; it attaches a
    // MediaStream to a media element in a browser-appropriate way.
    attachMediaStream(player, event.stream);
    player.play();
}
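
(Without the shim, the roughly equivalent code looks like this; srcObject is the standard property, while older browsers needed player.src = URL.createObjectURL(stream) instead:)

    function gotRemoteStream(event) {
        var player = new Audio();
        // Standard way to attach a MediaStream to a media element.
        player.srcObject = event.stream;
        player.play();
    }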

Playing back the audio via the WebAudio API is not working (yet) for me.

answered by Eelco


Not sure about Chrome; sounds like a bug.

Try it on Firefox (Nightly, I suggest); we have WebAudio support there, though I don't know all the details of what's currently supported.

Also, on Firefox at least we have stream = media_element.captureStreamUntilEnded(); we use it in some of our tests in dom/media/tests/mochitests, I believe. This lets you take any audio or video element and capture its output as a MediaStream.
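
(A minimal sketch of how that could replace the Web Audio plumbing on the sending side. Note the exact method name is an assumption: Firefox shipped it vendor-prefixed at the time, so the sketch feature-detects both spellings.)

    // Play the selected file in an <audio> element and capture its output
    // as a MediaStream to hand to the PeerConnection.
    var player = new Audio();
    player.src = URL.createObjectURL(file);
    player.play();

    // The method may be vendor-prefixed (mozCaptureStreamUntilEnded).
    var capture = player.captureStreamUntilEnded || player.mozCaptureStreamUntilEnded;
    var stream = capture.call(player);

    pc.addStream(stream);
    pc.createOffer(setLocalAndSendMessage);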

Edit: see below; both Chrome and Firefox have gaps in combining WebAudio with WebRTC PeerConnections, but in different places. Mozilla hopes to fix the last bug there very soon.

answered by jesup