I'm trying to get my head round WebRTC. I need to be able to capture and stream live audio through a web browser.
I'm just having difficulty finding code examples that I can understand or that are up-to-date. If anyone could help me with just capturing and playing audio in the same browser with HTML5/WebRTC, I think that would help me get started and on my way.
Note: I'm only concerned about getting this to work in Chrome (or Chrome Canary for that matter!).
Thanks for any help!
The HTML5 Rocks article on WebRTC is probably the best intro article that explains everything in layman's terms.
For simply capturing local video/audio, you'll want to focus on the MediaStream API (i.e., getUserMedia). Once you get that working, then you'll need to start looking into the RTCPeerConnection API.
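For the "capture and play audio in the same browser" step, a minimal sketch looks like the following. This uses the modern `navigator.mediaDevices.getUserMedia` promise API (older Chrome builds exposed a prefixed `navigator.webkitGetUserMedia` callback version instead); the `player` element id is illustrative, and you'd need a matching `<audio id="player">` in your page.

```javascript
// Sketch: capture microphone audio and route it to a local <audio> element.
// Assumes the page contains <audio id="player"></audio>.
async function startLocalAudio() {
  // Prompts the user for microphone permission and returns a live MediaStream.
  const stream = await navigator.mediaDevices.getUserMedia({
    audio: true,
    video: false,
  });

  const player = document.getElementById('player');
  player.srcObject = stream; // attach the live stream to the element
  player.muted = true;       // mute locally to avoid a feedback loop while testing
  await player.play();
}

startLocalAudio().catch(err => console.error('getUserMedia failed:', err));
```

Once you hear (or see, with a level meter) your own microphone working, you have a `MediaStream` you can later hand to an `RTCPeerConnection`.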
The client-side code for the RTCPeerConnection API is pretty straightforward, but the server-side code required for signalling (i.e., establishing a peer-to-peer connection) can be tricky.
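To give a feel for the client-side part, here is a rough sketch of the caller's side of the offer/answer flow. `sendToPeer` is a placeholder for whatever signalling channel you end up building; the STUN server URL is just a commonly used public example.

```javascript
// Sketch of the caller's RTCPeerConnection setup. 'sendToPeer' is a stand-in
// for your signalling transport (e.g., a WebSocket send).
const pc = new RTCPeerConnection({
  iceServers: [{ urls: 'stun:stun.l.google.com:19302' }],
});

// Forward ICE candidates to the remote peer as they are gathered.
pc.onicecandidate = e => {
  if (e.candidate) sendToPeer({ candidate: e.candidate });
};

// Play the remote peer's audio when its track arrives.
pc.ontrack = e => {
  document.getElementById('remote').srcObject = e.streams[0];
};

async function call(localStream) {
  // Add the local microphone tracks, then create and send an offer.
  localStream.getTracks().forEach(t => pc.addTrack(t, localStream));
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToPeer({ sdp: pc.localDescription }); // the callee replies with an answer
}
```

The callee does the mirror image: `setRemoteDescription` with the offer, `createAnswer`, `setLocalDescription`, and send the answer back over the same signalling channel.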
I ended up coding my own server-side solution in PHP, but to do so took me about three weeks of banging my head against the wall (i.e., trying to decipher the WebSocket specs) to get it to work properly. If you'd like to see actual code, I can post some of my working code.
If you're up for the challenge, I recommend trying to code the server-side script yourself, but otherwise, I would look into WebSocket libraries like Socket.IO, which do all the tricky server-side stuff for you.
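As a rough idea of how little the signalling server actually has to do with a library like Socket.IO, here is a Node.js sketch. The event names (`join`, `signal`) are my own invention; all that matters is that SDP offers/answers and ICE candidates get relayed between the two peers.

```javascript
// Minimal signalling relay sketch using Socket.IO (Node.js).
// Event names are illustrative, not part of any WebRTC standard.
const { Server } = require('socket.io');
const io = new Server(3000);

io.on('connection', socket => {
  // A peer joins a named room so the relay knows who to forward to.
  socket.on('join', room => socket.join(room));

  // Relay any signalling payload (offer, answer, or ICE candidate)
  // to the other peer(s) in the same room, excluding the sender.
  socket.on('signal', ({ room, data }) => {
    socket.to(room).emit('signal', data);
  });
});
```

Compare this to hand-rolling the WebSocket handshake and framing yourself, which is where the "three weeks of banging my head against the wall" went.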
If you are asking about WebRTC live audio streaming/broadcasting, not just peer-to-peer calls: WebRTC is not designed for broadcasts. See WebRTC - scalable live stream broadcasting / multicasting.