Here is my use case: Alice has a cool new media track that she wants Bob to listen in to. She selects the media file in her browser and the media file starts playing instantly in Bob's browser.
I'm not even sure if this is possible to build using WebRTC API right now. All examples I can find use streams obtained via getUserMedia() but this is what I have:
var context = new AudioContext();
var pc = new RTCPeerConnection(pc_config);

function handleFileSelect(event) {
    var file = event.target.files[0];
    if (file) {
        if (file.type.match('audio*')) {
            console.log(file.name);
            var reader = new FileReader();
            reader.onload = (function(readEvent) {
                context.decodeAudioData(readEvent.target.result, function(buffer) {
                    var source = context.createBufferSource();
                    var destination = context.createMediaStreamDestination();
                    source.buffer = buffer;
                    source.start(0);
                    source.connect(destination);
                    pc.addStream(destination.stream);
                    pc.createOffer(setLocalAndSendMessage);
                });
            });
            reader.readAsArrayBuffer(file);
        }
    }
}
On the receiving side I have the following:
function gotRemoteStream(event) {
    var mediaStreamSource = context.createMediaStreamSource(event.stream);
    mediaStreamSource.connect(context.destination);
}
This code does not make the media (music) play on the receiving side. I do, however, receive an ended event right after the WebRTC handshake is done and the gotRemoteStream function has been called. So gotRemoteStream gets called, but the media does not start playing.
On Alice's side the magic is supposed to happen in the line that says source.connect(destination). When I replace that line with source.connect(context.destination), the media plays correctly through Alice's speakers.
On Bob's side a media stream source is created from Alice's stream. However, when the local speakers are connected via mediaStreamSource.connect(context.destination), the music doesn't start playing through the speakers.
Of course I could always send the media file through a DataChannel, but where is the fun in that...
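For what it's worth, a DataChannel transfer could be sketched roughly as below. This is only an illustration, not a standard recipe: the chunk size, the chunkRanges/sendFileOverChannel names, and the "send chunks, then a 'done' marker" protocol are all my own choices.

```javascript
// Sketch: send a file over an RTCDataChannel in fixed-size chunks.
var CHUNK_SIZE = 16 * 1024; // 16 KiB per message (arbitrary choice)

// Pure helper: compute the [start, end) byte ranges for each chunk.
function chunkRanges(totalBytes, chunkSize) {
    var ranges = [];
    for (var start = 0; start < totalBytes; start += chunkSize) {
        ranges.push([start, Math.min(start + chunkSize, totalBytes)]);
    }
    return ranges;
}

// Browser-side sender: read and send one chunk at a time, in order,
// then signal completion with a plain 'done' string message.
function sendFileOverChannel(channel, file) {
    var ranges = chunkRanges(file.size, CHUNK_SIZE);
    var i = 0;
    (function sendNext() {
        if (i >= ranges.length) {
            channel.send('done');
            return;
        }
        var reader = new FileReader();
        reader.onload = function(e) {
            channel.send(e.target.result); // ArrayBuffer chunk
            i++;
            sendNext();
        };
        reader.readAsArrayBuffer(file.slice(ranges[i][0], ranges[i][1]));
    })();
}
```

The receiver would then reassemble the ArrayBuffer chunks into a Blob and feed it to an Audio element; that sidesteps WebRTC media streams entirely, which is exactly what makes it less fun.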
Any clues on what is wrong with my code or some ideas on how to achieve my use case would be greatly appreciated!
I'm using the latest and greatest Chrome Canary.
Thanks.
It is possible to play the audio using the Audio element like this:
function gotRemoteStream(event) {
    var player = new Audio();
    attachMediaStream(player, event.stream);
    player.play();
}
Playing back the audio via the WebAudio API is not working (yet) for me.
Not sure about Chrome; sounds like a bug.
Try it on Firefox (Nightly, I suggest); we have WebAudio support there, though I don't know all the details about what's supported currently.
Also, on Firefox at least we have stream = media_element.captureStreamUntilEnded(); we use it in some of our tests in dom/media/tests/mochitests, I believe. This lets you take any audio or video element and capture its output as a MediaStream.
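Put together, Alice's side could be sketched roughly like this (a sketch, not tested code: the streamFileViaElement name is mine, setLocalAndSendMessage comes from the question's code, and whether the capture method ships unprefixed or as a moz-prefixed variant may depend on the Firefox build):

```javascript
// Sketch (Firefox-only at the time of writing): play the selected File
// in an Audio element and capture its output as a MediaStream to send
// over the PeerConnection, instead of decoding it with WebAudio.
function streamFileViaElement(file, pc) {
    var player = new Audio();
    player.src = URL.createObjectURL(file); // play the File directly
    player.play();
    // Capture the element's output until playback ends; fall back to a
    // moz-prefixed name in case the build only exposes that variant.
    var capture = player.captureStreamUntilEnded ||
                  player.mozCaptureStreamUntilEnded;
    var stream = capture.call(player);
    pc.addStream(stream);
    pc.createOffer(setLocalAndSendMessage); // from the question's code
}
```

This avoids decodeAudioData entirely, so the buggy WebAudio-to-PeerConnection path never comes into play.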
Edit: see below; both Chrome and Firefox have misses in combining WebAudio with WebRTC PeerConnections, but in different places. Mozilla hopes to fix the last bug there very soon.