 

How to debug packet loss?

I wrote a C++ application (running on Linux) that serves an RTP stream of about 400 kbps. To most destinations this works fine, but some destinations experience packet loss. The problematic destinations seem to have a slower connection in common, but it should still be plenty fast enough for the stream I'm sending.

Since these destinations are able to receive similar RTP streams from other applications without packet loss, my own application might be at fault.

I already verified a few things:

- in a tcpdump, I see all RTP packets going out on the sending machine
- a UDP send buffer is in place (I tried sizes between 64 KB and 300 KB; see the sketch below)
- the RTP packets mostly stay below 1400 bytes to avoid fragmentation
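For reference, the send buffer is sized roughly like this (a minimal sketch; the socket setup and names are illustrative, not my actual application code):

```cpp
#include <sys/socket.h>
#include <cstdio>

// Sketch: enlarge the UDP send buffer on an already-created socket.
// Linux doubles the requested value internally and caps it at
// the net.core.wmem_max sysctl.
bool set_send_buffer(int sock_fd, int bytes) {
    if (setsockopt(sock_fd, SOL_SOCKET, SO_SNDBUF,
                   &bytes, sizeof(bytes)) < 0) {
        perror("setsockopt(SO_SNDBUF)");
        return false;
    }
    // Read back the effective size to verify what the kernel granted.
    int actual = 0;
    socklen_t len = sizeof(actual);
    getsockopt(sock_fd, SOL_SOCKET, SO_SNDBUF, &actual, &len);
    std::printf("effective send buffer: %d bytes\n", actual);
    return true;
}
```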

What can a sending application do to minimize the chance of packet loss, and what is the best way to debug such a situation?

asked Apr 27 '10 by Gene Vincent


1 Answer

Don't send out packets in big bursty chunks.

Packet loss like this is usually caused by slow routers with limited packet buffers. A slow router might handle 1 Mbps just fine if it has time to send out, say, 10 packets before receiving another 10, but if the 100 Mbps sender side hands it a burst of 50 packets, it has no choice but to drop 40 of them.

Try spreading out your sends so that you write only what is needed in each time period. If you need to send five packets per second, send one packet every 200 ms rather than all five in a single burst.
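A minimal sketch of such a paced send loop, assuming the stream parameters from the question (400 kbps with packets below 1400 bytes works out to roughly 36 packets per second, i.e. about one packet every 28 ms); `send_next_rtp_packet()` is a hypothetical stand-in for the application's own send routine:

```cpp
#include <chrono>
#include <thread>

// Hypothetical stand-in: sends one RTP packet, returns false at end of stream.
bool send_next_rtp_packet();

// Paced send loop: emit one packet roughly every 28 ms instead of bursting.
void paced_send_loop() {
    using clock = std::chrono::steady_clock;
    const auto interval = std::chrono::microseconds(28000);
    auto next_deadline = clock::now();
    while (send_next_rtp_packet()) {
        next_deadline += interval;
        // Sleeping until an absolute deadline avoids the cumulative drift
        // that sleep_for(interval) would accumulate over a long stream.
        std::this_thread::sleep_until(next_deadline);
    }
}
```

In a real sender the interval would be derived from the actual payload sizes and target bitrate rather than hard-coded, but the principle is the same: keep the instantaneous send rate close to the average rate.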

answered Sep 30 '22 by Zan Lynx