So I'm looking to build a chat app that will allow video, audio, and text. I spent some time researching WebSockets and WebRTC to decide which to use. Since there are plenty of video and audio apps built with WebRTC, it sounds like a reasonable choice, but are there other things I should consider? Feel free to share your thoughts.
Things like:
Browser support - Because it's newer, WebRTC is available only in some browsers, while WebSockets is supported in nearly all of them.
Scalability - WebSockets uses a server for the session, while WebRTC seems to be p2p.
Multiplexing/multiple chatrooms - Used in Google+ Hangouts; I'm still going through demo apps to figure out how to implement this.
Server - WebSockets needs something like RedisSessionStore or RabbitMQ to scale across multiple machines (see the sketch after this list).
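To make the Redis point concrete, here's a minimal sketch of what I have in mind for fanning messages out across several WebSocket server instances via Redis pub/sub (assuming the `ws` and `ioredis` npm packages; the "chat" channel name is just a placeholder):

```typescript
// Sketch: fan chat messages out across multiple WebSocket server
// instances via Redis pub/sub. Assumes the `ws` and `ioredis` npm
// packages; the "chat" channel name is just a placeholder.
import { WebSocketServer, WebSocket } from "ws";
import Redis from "ioredis";

const wss = new WebSocketServer({ port: 8080 });
const pub = new Redis(); // publishes messages from local clients
const sub = new Redis(); // a subscribing connection must be dedicated

sub.subscribe("chat");
sub.on("message", (_channel, message) => {
  // Deliver messages published by any instance to this instance's clients.
  for (const client of wss.clients) {
    if (client.readyState === WebSocket.OPEN) client.send(message);
  }
});

wss.on("connection", (socket) => {
  socket.on("message", (data) => {
    // Publish to Redis so every instance (including this one) broadcasts it.
    pub.publish("chat", data.toString());
  });
});
```

Each instance publishes incoming messages to Redis and broadcasts whatever arrives on the channel to its own local clients, so clients connected to different machines still see each other's messages.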
WebRTC doesn't use WebSockets. It has its own set of protocols, including SRTP, TURN, STUN, DTLS, and SCTP. The catch is that WebRTC has no signaling of its own, and signaling is necessary in order to open a WebRTC peer connection; it is handled over other transports such as HTTPS or secure WebSockets.
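To make that concrete, here is a minimal sketch of a signaling relay over WebSockets; it never interprets the SDP or ICE payloads, it just forwards them between peers (assuming the `ws` npm package; using the request URL as the room id is a simplifying assumption):

```typescript
// Sketch of a "dumb" signaling relay: it never parses SDP or ICE, it
// only forwards opaque messages between the peers sharing a room.
import { WebSocketServer, WebSocket } from "ws";

const rooms = new Map<string, Set<WebSocket>>();
const wss = new WebSocketServer({ port: 9090 });

wss.on("connection", (socket, request) => {
  const roomId = request.url ?? "/default";
  const peers = rooms.get(roomId) ?? new Set<WebSocket>();
  peers.add(socket);
  rooms.set(roomId, peers);

  socket.on("message", (data) => {
    // Forward offers, answers, and ICE candidates to the other peer(s).
    for (const peer of peers) {
      if (peer !== socket && peer.readyState === WebSocket.OPEN) {
        peer.send(data.toString());
      }
    }
  });

  socket.on("close", () => peers.delete(socket));
});
```

The point is that the relay is dumb: all the WebRTC-specific negotiation happens in the browsers, and any transport that can pass opaque messages between the peers would do.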
WebRTC vs WebSockets – key differences: WebRTC is more secure (encryption is mandatory, via DTLS/SRTP). At the moment, WebRTC is supported only by certain browsers, while WebSockets works in almost all existing browsers. In terms of scalability, WebSockets takes a server-per-session approach, while WebRTC is peer-to-peer.
Avoid using WebSockets if only a small number of messages will be sent or if the messaging is very infrequent. Unless the client must quickly receive or act upon updates, maintaining the open connection may be an unnecessary waste of resources.
On speed, be careful with blanket claims: for plain text chat, WebSockets can be quicker to get going (no ICE negotiation before the first message), but for media transfer WebRTC's UDP-based transport generally achieves lower latency than WebSocket's TCP.
WebRTC is designed for high-performance, high quality communication of video, audio and arbitrary data. In other words, for apps exactly like what you describe.
WebRTC apps need a service via which they can exchange network and media metadata, a process known as signaling. However, once signaling has taken place, video/audio/data is streamed directly between clients, avoiding the performance cost of streaming via an intermediary server.
WebSocket, on the other hand, is designed for bi-directional communication between client and server. It is possible to stream audio and video over WebSocket (see here for example), but the technology and APIs are not inherently designed for efficient, robust streaming in the way that WebRTC is.
As other replies have said, WebSocket can be used for signaling.
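To tie the two together, here is a rough browser-side sketch in which the WebSocket carries only signaling and the audio/video flows peer-to-peer once negotiation completes. The `{ kind, payload }` message shape, the signaling URL, and the `#remote` video element are assumptions for the sketch, not part of any standard:

```typescript
// Browser-side sketch: the WebSocket carries only signaling messages;
// audio/video then flows peer-to-peer. The { kind, payload } message
// shape, the URL, and the #remote element are assumptions.
const signaling = new WebSocket("wss://example.com/signal/room42");
const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});

// Trickle ICE: ship each candidate to the other peer as it is discovered.
pc.onicecandidate = ({ candidate }) => {
  if (candidate) signaling.send(JSON.stringify({ kind: "ice", payload: candidate }));
};

// Remote tracks arrive here once the peer connection is up.
pc.ontrack = ({ streams }) => {
  const video = document.querySelector<HTMLVideoElement>("#remote");
  if (video && streams[0]) video.srcObject = streams[0];
};

signaling.onmessage = async (event) => {
  const { kind, payload } = JSON.parse(event.data);
  if (kind === "offer") {
    await pc.setRemoteDescription(payload);
    const answer = await pc.createAnswer();
    await pc.setLocalDescription(answer);
    signaling.send(JSON.stringify({ kind: "answer", payload: answer }));
  } else if (kind === "answer") {
    await pc.setRemoteDescription(payload);
  } else if (kind === "ice") {
    // Production code would queue candidates that arrive before
    // setRemoteDescription has run.
    await pc.addIceCandidate(payload);
  }
};

// Each peer adds its mic/camera; the caller additionally sends the offer.
async function call(isCaller: boolean) {
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  for (const track of stream.getTracks()) pc.addTrack(track, stream);
  if (isCaller) {
    const offer = await pc.createOffer();
    await pc.setLocalDescription(offer);
    signaling.send(JSON.stringify({ kind: "offer", payload: offer }));
  }
}
```

The callee would run `call(false)` when joining (before the offer arrives) and the caller `call(true)`; once the offer/answer and ICE exchange succeed, media flows directly between the peers and the WebSocket is only needed for renegotiation.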
I maintain a list of WebRTC resources; I strongly recommend you start by looking at the 2013 Google I/O presentation about WebRTC.