It's amazing that live-streaming sites like doitlive.tv can deliver video over a very low-bandwidth home connection (as low as 25 kbps). Could someone explain to me the technology behind such sites and how they manage to deliver under such low bandwidth?
First, you use a video camera or webcam to capture video. Second, the video is sent to an encoder, which may be a dedicated hardware device or a piece of software. The encoder takes the raw output from your camera and converts it into smaller files that can be streamed online.
The Real-time Streaming Protocol (RTSP), the Real-time Transport Protocol (RTP), and the Real-time Transport Control Protocol (RTCP) were designed specifically to stream media over networks. RTSP runs over a variety of transport protocols, while the latter two are usually built on top of UDP.
There are various factors that can interfere with an online stream, such as Internet traffic jams. Video is more susceptible than audio to such interruptions. To smooth them over, the player buffers the streamed content.
When the TV or streaming device runs out of video to play, it has to stop playback and re-fill its buffer with more video ("re-buffering"). Re-buffering is caused by drops in your Internet connection speed.
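To make the buffering idea concrete, here is a toy playback-buffer model. It is a hypothetical illustration, not any real player's logic: the client downloads video at `download_kbps` while playback drains the buffer at the video's bitrate, and whenever the buffer empties the player has to stop and re-fill it.

```python
# Toy model of a playback buffer (illustrative assumption, not a real player).

def simulate(download_kbps, video_kbps, startup_buffer_s=5, duration_s=60):
    """Return how many re-buffering events occur over `duration_s` seconds."""
    buffer_s = startup_buffer_s          # seconds of video buffered ahead
    rebuffers = 0
    for _ in range(duration_s):
        # Each real-time second we download download_kbps/video_kbps seconds'
        # worth of video, and play back exactly 1 second of it.
        buffer_s += download_kbps / video_kbps - 1
        if buffer_s < 0:
            rebuffers += 1               # playback stalls...
            buffer_s = startup_buffer_s  # ...until the buffer is refilled
    return rebuffers

print(simulate(download_kbps=25, video_kbps=25))  # connection keeps up: 0
print(simulate(download_kbps=25, video_kbps=50))  # too slow: stalls repeatedly
```

As long as the download rate matches or exceeds the video bitrate, the buffer never empties; the moment it falls behind, stalls are inevitable no matter how large the startup buffer is.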
I've been working closely with a few major companies at work lately on this very issue. First and foremost, as already mentioned in other answers, a Content Delivery Network is utilized to provide optimum distribution.
A CDN is basically a worldwide cluster of servers that holds many copies of a single resource. So when you request that resource from, say, New York City, you get the copy that is physically closest to New York. There are many great explanations of how CDNs work.
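The "physically closest" step above can be sketched as a simple selection over measured latencies. The server names and numbers here are made up for illustration; real CDNs use DNS tricks and anycast routing rather than a literal lookup like this.

```python
# Hypothetical sketch of a CDN's edge-server selection: serve the client
# from whichever edge node answers fastest. Names/latencies are invented.

def pick_edge_server(latencies_ms):
    """Return the edge server with the lowest measured round-trip time."""
    return min(latencies_ms, key=latencies_ms.get)

# A client in New York might measure something like this:
latencies = {"nyc-edge": 12, "london-edge": 85, "tokyo-edge": 190}
print(pick_edge_server(latencies))  # -> nyc-edge
```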
Your question about bandwidth involves a technology called Adaptive Bit Rate Streaming. Let's say you have a live broadcast streaming out to the web. As it streams, a piece of technology called a segmenter chunks the stream into small segments that can be pieced together later. Each segment is encoded at various resolutions and bitrates. So, as you, the client, request the files, the CDN can tell how strong your connection is. If it is weak, Adaptive Bit Rate Streaming gives you a lower-resolution segment. Have you ever noticed how, when you start watching video online, it's low quality but improves over time? That is a perfect example of this technology: as your buffer fills and your connection proves fast enough, you get higher-quality segments. Hit up the websites of companies like Level 3 and Akamai and you can read a lot of their white papers on how it all works.
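The rendition-picking step described above can be sketched in a few lines. The bitrate ladder and the "highest affordable rendition" rule here are illustrative assumptions; real HLS/DASH players use more sophisticated heuristics (buffer occupancy, throughput smoothing, etc.).

```python
# Minimal sketch of an adaptive-bitrate decision. The rendition ladder is a
# made-up example; real streams publish their own set of encodings.

RENDITIONS_KBPS = [25, 100, 400, 1500]  # each segment exists at all of these

def choose_rendition(measured_kbps):
    """Pick the highest bitrate the measured throughput can sustain."""
    affordable = [r for r in RENDITIONS_KBPS if r <= measured_kbps]
    return affordable[-1] if affordable else RENDITIONS_KBPS[0]

# Throughput improves as playback settles, so quality ramps up over time:
for throughput in [30, 120, 500, 2000]:
    print(f"{throughput} kbps link -> request {choose_rendition(throughput)} kbps segment")
```

This is also why a 25 kbps connection can still get a watchable stream: the server simply keeps handing out segments from the bottom rung of the ladder.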
Here's an article on Adaptive Bit Rate Streaming.
Two things: Content Delivery Networks, and reducing file size.
The first lets them store their files on servers in data centers around the world, lowering transit costs and putting the content closer to the user. This means faster downloads and streaming.
The second comes in many forms, most notably reducing resolution, lowering the bitrate, and using more advanced codecs. If you reduce a 640x480 video to 320x240, you need approximately a quarter of the space to store it. Likewise, a lower bitrate makes a video blockier, but that loss of precision is also a loss in file size. Finally, more advanced codecs like H.264 can compress the same video, at the same visual quality, into a smaller file than older or simpler codecs like MPEG-1 or DivX.
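A quick back-of-the-envelope calculation makes these savings concrete. The bitrates below are illustrative assumptions (not measurements of any particular codec); the point is just that file size scales with bitrate times duration.

```python
# Rough size estimates: size = bitrate x duration. Bitrates are invented
# examples for the two resolutions mentioned above.

def size_mb(bitrate_kbps, duration_s):
    """Approximate file size in megabytes for a given average bitrate."""
    return bitrate_kbps * duration_s / 8 / 1000  # kilobits -> megabytes

minute = 60
print(size_mb(400, minute))  # 640x480 at a hypothetical 400 kbps: 3.0 MB/min
print(size_mb(100, minute))  # 320x240 (~1/4 the pixels): 0.75 MB/min
```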
One simple approach that was used in the past, and may or may not still be used today, is UDP instead of TCP. UDP has lower overhead. For things like a PDF or a program you need every byte; lose a few bytes, or a few packets' worth, and the whole thing is useless. For video and audio this is not the case: we tend to forgive, or perhaps not notice, dropped frames or some missing audio. UDP does not have the guaranteed reliability that TCP has, but for streaming that is okay; speed is more important than reliability, so long as it is good enough.
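You can see UDP's fire-and-forget character with the standard `socket` module. A minimal sketch (over loopback, so nothing is actually lost here): the sender gets no acknowledgement and never retransmits, which is exactly the trade-off that suits streaming.

```python
# UDP datagrams over loopback: no handshake, no ACKs, no retransmission.

import socket

receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))      # let the OS pick a free port
addr = receiver.getsockname()

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for frame_no in range(3):
    # Each datagram stands alone; on a real network, a lost one would
    # simply be a dropped frame rather than a stall.
    sender.sendto(f"frame {frame_no}".encode(), addr)

for _ in range(3):
    data, _ = receiver.recvfrom(1024)
    print(data.decode())

sender.close()
receiver.close()
```

Compare that with TCP, where a single lost packet forces a retransmission and holds up everything behind it, which is exactly what causes stalls in a live stream.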
The most important reason we can stream video today is compression technology. Each new technology (H.264) or version (MPEG-1, MPEG-2, ...) delivers better quality at the same bitrate, lower bitrates at the same quality, or both. The algorithm trades bandwidth for computation power on both the encoder and the decoder. Ideally the bulk of the work is on the encoding side so that the decoding algorithm can stay simple. If you have ever tried to encode good-quality MPEG-2 or H.264 video, it often takes significantly longer to encode the video than it does to play it.
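A loose analogy for that trade-off, using `zlib` from the standard library: `zlib` is a generic lossless byte compressor, not a video codec, but it shows the same principle of spending more encoding effort to produce fewer bytes to transmit, while decoding stays cheap either way.

```python
# Generic-compression analogy (not a video codec): more encoder effort
# (level 9 vs. level 1) buys a smaller payload to send over the wire.

import zlib

frames = b"a fairly repetitive stream of video-like data " * 2000

fast = zlib.compress(frames, level=1)  # quick encode, larger output
slow = zlib.compress(frames, level=9)  # slower encode, smaller output

print(len(frames), len(fast), len(slow))
assert zlib.decompress(slow) == frames  # decoding recovers the bytes exactly
```

Video codecs push this much further by also discarding detail the eye won't miss (lossy compression), which is where the really big bitrate savings come from.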
Add to that other networking tricks like the Content Delivery Network/System described in other answers to this question. The long and short of it is reducing the number of hops between you and a server holding the content, as well as spreading the overall delivery bandwidth across many servers.
If it's about Flash streaming, there are media servers like Adobe Flash Media Server, Wowza, and the open-source Red5.
These are used to stream recorded or live streams over the web.
The bandwidth usage depends on the video and audio codecs.
You can see the codecs supported by Adobe Flash here. You may also want to check this Wikipedia article.
Those may give you an idea.