I'm trying to play a MediaStream from a remote peer (WebRTC) using the Web Audio API. When I attach the stream to an audio
element using audio.srcObject = stream
it plays fine, but when I try to use an AudioContext it does not play any sound at all (I need to avoid the audio/video
HTML tags).
This piece works:
<audio controls></audio>
<script>
  const audioEl = document.getElementsByTagName('audio')[0];
  audioEl.srcObject = MY_STREAM;
  audioEl.play();
</script>
This one does not:
const audioContext = new AudioContext();
const sourceNode = audioContext.createMediaStreamSource(MY_STREAM);
sourceNode.connect(audioContext.destination);
// Trying even 'audioContext.resume()' after user gesture with no luck
What is weird is that when MY_STREAM
is my microphone, it plays nicely through the Web Audio API (I hear the feedback from my mic).
That suggests there is something different between a microphone MediaStream and the one I get from a WebRTC connection, but then why does it play through the plain HTML audio
tag?
As demonstrated by @jib, this is a Chrome bug.
I have opened a new issue to let them know about it.
I thought I had found a workaround by simply assigning this MediaStream to the srcObject
of a dummy HTMLAudioElement,
new Audio().srcObject = mediaStream;
but somehow, while testing on my localhost, it didn't persist over time, whereas in this fiddle it does.
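For reference, a minimal sketch of that workaround, assuming `mediaStream` is the remote WebRTC stream. One plausible (unconfirmed) explanation for the sound dropping out is the dummy element being garbage-collected, so the sketch keeps a long-lived reference to it:

```javascript
// Hypothetical workaround sketch -- variable names are illustrative.
// Keep the dummy element in a long-lived variable so it is not
// garbage-collected (a possible, unconfirmed cause of the dropout).
const dummyAudio = new Audio();
dummyAudio.srcObject = mediaStream; // never played; just "primes" the stream

// Now route the same stream through the Web Audio API as usual.
const audioContext = new AudioContext();
const sourceNode = audioContext.createMediaStreamSource(mediaStream);
sourceNode.connect(audioContext.destination);
```

This only restates the one-liner above in a form where the element reference is retained; it is not a guaranteed fix.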
I also encountered a lot of other weird behavior while playing around, like different tabs affecting each other, and things like that.
Add to that other unfixed bugs in the same area that make me suspect false positives, and all in all, I fear there is no proper solution other than waiting for them to fix it...