
Send MediaStream object with Web Audio effects over PeerConnection

I'm trying to send audio, obtained by getUserMedia() and altered with the Web Audio API, over a PeerConnection from WebRTC. The Web Audio API and WebRTC seem to have the ability to do this but I'm having trouble understanding how this can be done. Within the Web Audio API, the AudioContext object contains a method createMediaStreamSource(), which provides a way to connect the MediaStream obtained by getUserMedia(). Also, there is a createMediaStreamDestination() method, which seems to return an object with a stream attribute.

I'm getting both audio and video from the getUserMedia() method. What I'm having trouble with is how to pass this stream object (with both audio and video) into those methods (e.g. createMediaStreamSource()). Do I first need to somehow extract the audio from the stream (getAudioTracks) and find a way to combine it back with the video? Or do I pass it as is and the video is left unaffected? Can the audio only be altered once (before it is added to the PeerConnection)?

asked Oct 17 '14 by chRyNaN

1 Answer

The createMediaStreamSource() method takes a MediaStream object as its parameter and uses the first audio MediaStreamTrack from that object as the audio source. It can be used with the MediaStream object received from the getUserMedia() method even if that object contains both audio and video. For instance:

var source = context.createMediaStreamSource(localStream);

Where "context", in the above code, is an AudioContext object and "localStream" is a MediaStream object obtained from getUserMedia(). The createMediaStreamDestination() method creates a destination node object which has a MediaStream object within its "stream" attribute. This MediaStream object only contains one AudioMediaStreamTrack (even if the input stream to the source contained both audio and video or numerous audio tracks): the altered version of the track obtained from the stream within the source. For instance:

var destination = context.createMediaStreamDestination();

Now, before the stream attribute of the newly created destination carries any audio, you must build the audio graph by linking the nodes together. For this example, let's assume we have a BiquadFilterNode named filter.
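
If such a filter node doesn't exist yet, here is a minimal sketch of creating one from the same AudioContext; the lowpass type and 1 kHz cutoff below are only illustrative values, not anything required by the API:

var filter = context.createBiquadFilter(); // create a BiquadFilterNode from the AudioContext
filter.type = "lowpass";                   // illustrative: pass frequencies below the cutoff
filter.frequency.value = 1000;             // illustrative cutoff of 1 kHz

Then link the nodes together: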

source.connect(filter);
filter.connect(destination);

Then we can obtain the stream attribute from the destination variable and add it to the PeerConnection object to send to the remote peer:

peerConnection.addStream(destination.stream);

Note: the stream attribute contains a MediaStream object with only the altered audio MediaStreamTrack, so there is no video. If you want video to be sent as well, you'll have to add this audio track to a stream object that also contains a video track:

var audioTracks = destination.stream.getAudioTracks();
var track = audioTracks[0]; //stream only contains one audio track
localStream.addTrack(track);
peerConnection.addStream(localStream);

Keep in mind that the addTrack() method will not add the track if the MediaStream object already contains a track with the same id. You may therefore have to first remove the original audio track, the one that was fed into the source node.
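
As a minimal sketch of that idea (assuming localStream still contains the original, unprocessed microphone track):

var originalAudioTrack = localStream.getAudioTracks()[0];     // the unprocessed track from getUserMedia()
localStream.removeTrack(originalAudioTrack);                  // drop it so only the altered audio is sent
localStream.addTrack(destination.stream.getAudioTracks()[0]); // add the processed track in its place
peerConnection.addStream(localStream);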

The sound can be altered at any time by adjusting the values of the intermediate nodes (between the source and the destination), because the stream passes through those nodes before being sent to the other peer. Check out this example on dynamically changing the effect on a recorded sound (it should work the same way for a stream). Note: I have not tested this code yet. Though it works in theory, there may be some cross-browser issues, since both the Web Audio API and WebRTC are still working drafts and not yet standardized. I expect it to work in Mozilla Firefox and Google Chrome.
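
As a small illustration of adjusting an intermediate node while the stream is live (the values below are arbitrary):

filter.frequency.value = 500;                                   // change the cutoff right away
filter.frequency.setValueAtTime(2000, context.currentTime + 5); // or schedule a change via the AudioParam API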

Reference

  • Media Capture and Streams
  • Web Audio API

answered Oct 04 '22 by chRyNaN