If you are developing an interactive livestream application, such as a video conference or a remote laboratory, you depend on ultra-low (real-time) latency.
The two protocols that seem suitable for these circumstances are RTSP/RTP and RTMP.

(WebRTC: As I'm trying to give a bigger audience the possibility to interact with each other, WebRTC does not seem suitable, because as far as I know it is not designed for larger audiences.)
My questions:
- Which one should I choose for this use case: RTSP/RTP or RTMP?
- Which protocol delivers better results regarding end-to-end latency and session start-up time?
- Which one consumes more hardware resources?
- RTMP seems to use a persistent TCP connection, but which protocol is used for the actual transmission? It cannot be TCP, because that could not ensure real-time latency, could it?
- What, in general, are the pros and cons of using either protocol?
I did not find any comparison of these two protocols in scientific papers or books, only that the famous mobile live-streaming app Periscope uses RTMP.
Other apps such as Instagram and Facebook, for example, provide text-based interaction with the streamer. If developers want to build the next "killer application" based on interactive livestreams, I think this question is essential to answer.
You make a lot of assumptions in your question.
> WebRTC: As I'm trying to give a bigger audience the possibility to interact with each other, WebRTC does not seem suitable, because as far as I know it is not designed for larger audiences.
That's simply not true. WebRTC doesn't know or care how you structure your applications server-side. There are plenty of off-the-shelf services for handling large group calls and low latency video distribution via WebRTC.
You should also know that for the media streams, WebRTC is RTP under the hood.
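To make that concrete, here is a minimal sketch of a browser-side WebRTC publisher in TypeScript. The `sendToSignalingServer` helper is hypothetical (WebRTC deliberately leaves signaling up to the application), and the STUN URL is just a common public example; once negotiation completes, the captured tracks are carried over (S)RTP.

```typescript
// Hypothetical signaling helper: in practice this would be a WebSocket or
// HTTP call to your own server, or to an SFU service that fans the stream
// out to a large audience.
declare function sendToSignalingServer(message: {
  type: "offer";
  sdp: RTCSessionDescription | null;
}): void;

async function startPublishing(): Promise<void> {
  const pc = new RTCPeerConnection({
    iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
  });

  // Capture camera and microphone; these media tracks travel over (S)RTP
  // under the hood once the peer connection is established.
  const media = await navigator.mediaDevices.getUserMedia({
    video: true,
    audio: true,
  });
  media.getTracks().forEach((track) => pc.addTrack(track, media));

  // Create and send the SDP offer; the answer comes back over signaling.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  sendToSignalingServer({ type: "offer", sdp: pc.localDescription });
}
```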
> It cannot be TCP, because that could not ensure real-time latency, could it?
Of course it can. There's some overhead with TCP, but nothing that prevents you from using it in a real-time scenario; the overhead is minimal.
UDP is traditionally used for these sorts of scenarios, as reliability isn't required, but that doesn't mean TCP can't be used almost as performantly.
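As a small illustration, here is a sketch in Node.js/TypeScript (the host and port are placeholders, not a real service) of the usual first TCP latency tweak: disabling Nagle's algorithm so small media chunks are flushed immediately instead of being coalesced.

```typescript
import * as net from "net";

// Sketch of a low-latency TCP sender. "media.example.com" and port 1935
// are placeholder values for illustration only.
const socket = net.connect({ host: "media.example.com", port: 1935 }, () => {
  // Disable Nagle's algorithm: ship each small write immediately rather
  // than buffering it, which matters when every write is a media chunk.
  socket.setNoDelay(true);

  const chunk = Buffer.from("example media payload");
  socket.write(chunk);
});
```

On a clean network this performs close to UDP; the gap shows up mainly under packet loss, where TCP's retransmissions add delay.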
RTMP
RTMP is a dead protocol for Flash. No browsers support it. Other clients only support it for legacy reasons. You shouldn't use it for anything new going forward.
> ... only that the famous mobile live-streaming app Periscope uses RTMP.
Well, that's not a reason to do much of anything.
> Which protocol delivers better results regarding end-to-end latency and session start-up time?
WebRTC
> Which one consumes more hardware resources?
That's not the right question to ask. The overhead in almost any other part of the application is going to be far greater than the transport overhead of the protocol used for distribution.
The real list of things you need to think about:

- How much latency can you actually tolerate for your use case?
- What clients do you need to support (browsers, native mobile apps, something else)?
- How large is your audience, and how will you distribute the stream to it?
You might also find my post here helpful: https://stackoverflow.com/a/37475943/362536
In short, check your assumptions. Understand the tradeoffs. Make decisions based on real information, not sweeping generalizations.