 

Live streaming audio with WebRTC browser => server

I'm trying to send an audio stream from my browser to a server (I've tried UDP and also WebSockets). I'm recording the audio stream with WebRTC, but I'm having problems transmitting the data from my Node.js client to my server. Any ideas? Is it possible to send an audio stream to the server using WebRTC (OpenWebRTC)?

asked Jan 16 '18 by Lola_Padilya

People also ask

Can I use WebRTC for live streaming?

WebRTC leverages three HTML5 APIs enabling browsers to capture, encode, and transmit live streams. While streaming workflows can often require an IP camera, encoder, and streaming software, the most basic WebRTC use-cases can manage the whole enchilada with just a webcam and browser.

Does WebRTC need a server?

Does WebRTC Need a Server? WebRTC can easily connect two browsers on a local area network. However, WebRTC and browsers alone aren't capable of connecting through the internet. WebRTC needs a server to handle tasks like getting through firewalls and routing data outside of your local network.

What is RTC in streaming?

WebRTC is an open technology specification for enabling real-time communication (RTC) across browsers and mobile applications via simple APIs. It uses peering techniques for real-time data exchange between connected peers and provides low latency media streaming required for human-to-human interaction.

Is WebRTC a client server?

You can use WebRTC with a Node.js server, but WebRTC is really a protocol for persistent communication between two clients. Using Socket.IO will set up a persistent connection between a client and your server.


1 Answer

To get audio from the browser to the server, you have a few different possibilities.

Web Sockets

Simply send the audio data over a binary web socket to your server. You can use the Web Audio API with a ScriptProcessorNode (since deprecated in favor of AudioWorklet) to capture raw PCM and send it losslessly. Or, you can use the MediaRecorder to record the MediaStream and encode it with a codec like Opus, which you can then stream over the Web Socket.
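As a rough sketch of the MediaRecorder approach, assuming a hypothetical wss://example.com/audio endpoint that accepts binary WebM/Opus chunks:

```js
// Sketch: capture microphone audio, encode it with MediaRecorder (Opus),
// and stream the chunks over a binary WebSocket.
// wss://example.com/audio is a placeholder endpoint, not a real service.
async function streamAudio() {
  const ws = new WebSocket('wss://example.com/audio');

  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream, { mimeType: 'audio/webm;codecs=opus' });

  recorder.ondataavailable = (e) => {
    if (ws.readyState === WebSocket.OPEN) ws.send(e.data); // Blobs go out as binary frames
  };

  ws.onopen = () => recorder.start(250); // emit a chunk roughly every 250 ms
}
```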

There is a sample for doing this with video over on Facebook's GitHub repo. Streaming audio only is conceptually the same thing, so you should be able to adapt the example.

HTTP (future)

In the near future, you'll be able to use a ReadableStream as the request body with the Fetch API, allowing you to make a normal HTTP PUT with a streaming source from the browser. This is essentially the same as what you would do with a Web Socket, just without the Web Socket layer.
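A hedged sketch of what that might look like, again against a placeholder endpoint; streaming request bodies, as they have since shipped in Chromium, require HTTP/2 and the duplex: 'half' option:

```js
// Sketch: pipe MediaRecorder chunks into a ReadableStream and upload it
// as the body of a single long-lived PUT.
// https://example.com/audio is a placeholder endpoint.
async function streamAudioOverHttp() {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream, { mimeType: 'audio/webm;codecs=opus' });

  const body = new ReadableStream({
    start(controller) {
      recorder.ondataavailable = async (e) => {
        controller.enqueue(new Uint8Array(await e.data.arrayBuffer()));
      };
      recorder.onstop = () => controller.close();
    }
  });

  recorder.start(250);
  await fetch('https://example.com/audio', {
    method: 'PUT',
    body,
    duplex: 'half', // required when the request body is a stream
  });
}
```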

WebRTC (data channel)

With a WebRTC connection and the server as a "peer", you can open a data channel and send that exact same PCM or encoded audio that you would have sent over Web Sockets or HTTP.
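For illustration only (the caveat below still applies), the browser side might look something like this; the offer/answer exchange with the server "peer" is assumed to be handled elsewhere:

```js
// Sketch: send MediaRecorder chunks over a WebRTC data channel instead of
// a WebSocket. Assumes signaling with the server "peer" happens elsewhere
// (e.g. over a plain WebSocket used only for signaling).
async function streamAudioOverDataChannel(pc /* an RTCPeerConnection */) {
  const channel = pc.createDataChannel('audio');
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const recorder = new MediaRecorder(stream, { mimeType: 'audio/webm;codecs=opus' });

  channel.onopen = () => {
    recorder.ondataavailable = async (e) => {
      channel.send(await e.data.arrayBuffer());
    };
    recorder.start(250);
  };
}
```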

There's a ton of complexity added to this with no real benefit. Don't use this method.

WebRTC (media streams)

WebRTC calls support direct handling of MediaStreams. You can attach a stream and let the WebRTC stack take care of negotiating a codec, adapting for bandwidth changes, dropping data that doesn't arrive, maintaining synchronization, and negotiating connectivity around restrictive firewall environments. While this makes things easier on the surface, that's a lot of complexity as well. There aren't any packages for Node.js that expose the MediaStreams to you, so you're stuck dealing with other software... none of it as easy to integrate as it could be.
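On the browser side, at least, this amounts to only a few lines; signaling and the server-side media stack are the hard parts. A minimal sketch:

```js
// Sketch: hand the microphone track to the WebRTC stack and let it handle
// encoding, pacing, and transport. Signaling and the server-side RTP
// endpoint (e.g. GStreamer, as mentioned below) are assumed elsewhere.
async function attachMicrophone(pc /* an RTCPeerConnection */) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  for (const track of stream.getAudioTracks()) {
    pc.addTrack(track, stream);
  }
}
```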

Most folks going this route will run GStreamer as an RTP server to handle the media component. I'm not convinced this is the best way, but it's the best way I know of at the moment.

answered Sep 29 '22 by Brad