What is the difference between delay and jitter in the context of real-time applications?

According to Wikipedia, jitter is the undesired deviation from true periodicity of an assumed periodic signal; according to a paper on QoS that I am reading, jitter is referred to as delay variation. Is there a definition of jitter in the context of real-time applications? Are there applications that are sensitive to jitter but not sensitive to delay? If, for example, a streaming application uses some kind of buffer to store packets before showing them to the user, is it possible that this application is not sensitive to delay but is sensitive to jitter?
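
For concreteness, here is a minimal sketch of the buffering scenario described above. All numbers (packet period, buffer size, delays) are made up for illustration and are not taken from any real application:

    # Minimal sketch of a playout (jitter) buffer; all numbers are made up.
    # Packets are sent every 20 ms; the receiver waits `buffer_ms` after the
    # first packet arrives, then plays one packet every 20 ms.

    def late_packets(arrival_ms, period_ms=20, buffer_ms=60):
        """Return indices of packets that arrive after their playout deadline."""
        playout_start = arrival_ms[0] + buffer_ms
        deadlines = [playout_start + i * period_ms for i in range(len(arrival_ms))]
        return [i for i, (a, d) in enumerate(zip(arrival_ms, deadlines)) if a > d]

    # Large but constant delay (500 ms): playback just starts later, never stalls.
    constant_delay = [500 + 20 * i for i in range(10)]

    # Smaller average delay but up to ~90 ms of jitter: some packets miss
    # their playout deadline even though the overall delay is lower.
    jitter = [0, 5, 90, 10, 0, 85, 5, 0, 70, 0]
    jittery_delay = [100 + 20 * i + j for i, j in enumerate(jitter)]

    print(late_packets(constant_delay))   # []        -> tolerant of constant delay
    print(late_packets(jittery_delay))    # [2, 5, 8] -> hurt by jitter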

Asked by Avraam Mavridis on Dec 18 '13


People also ask

What is jitter and how does it impact real-time audio and video?

Jitter is measured in milliseconds (ms). A delay variation of around 30 ms or more can result in distortion and disruption to a call. For video streaming to work efficiently, jitter should be below 30 ms. If the received jitter is higher than this, the stream can start to lag, resulting in packet loss and problems with audio quality.

What is the difference between delay and jitter?

ANSWER: Delay is defined as the end-to-end time required for the signal to travel from transmitter to receiver, and jitter is defined as the variation of delay for packets belonging to the same flow.
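
That definition translates directly into a calculation. Below is a minimal sketch in Python with made-up timestamps; the smoothed estimator at the end is the interarrival-jitter formula used by RTP (RFC 3550):

    # Hypothetical send/receive timestamps (ms) for packets of one flow.
    send_ms    = [0,   20,  40,  60,  80]
    receive_ms = [105, 128, 141, 170, 183]

    # Delay: end-to-end transit time of each packet.
    delays = [r - s for s, r in zip(send_ms, receive_ms)]       # [105, 108, 101, 110, 103]

    # Jitter: variation of that delay between consecutive packets of the flow.
    variation = [abs(delays[i] - delays[i - 1]) for i in range(1, len(delays))]
    print("per-packet delay:", delays)
    print("delay variation :", variation)                       # [3, 7, 9, 7]

    # RTP (RFC 3550) keeps a running, smoothed estimate of the same quantity:
    #   J(i) = J(i-1) + (|D(i-1, i)| - J(i-1)) / 16
    j = 0.0
    for d in variation:
        j += (d - j) / 16
    print("smoothed jitter :", round(j, 2))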

What is the difference between jitter and timeliness?

In the case of video and audio, timely delivery means delivering data as they are produced, in the same order that they are produced, and without significant delay. This kind of delivery is called real-time transmission. Jitter, by contrast, refers to the variation in packet arrival time.

What is the relationship between jitter and latency?

The major distinction between jitter and latency is that latency is the delay through the network, whereas jitter is the variation in that latency. Increases in jitter and latency both degrade network performance, so it is important to monitor them regularly.


1 Answer

Delay: the amount of time data (a signal) takes to reach the destination. A higher delay generally means congestion of some sort or a break in the communication link.

Jitter: the variation of that delay over time. It happens when a system is not in a deterministic state; e.g., video streaming suffers from jitter a lot because the amount of data transferred is quite large, so there is no way of saying how long a given piece of it will take to arrive.

If your application is sensitive to jitter, it is definitely sensitive to delay.
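
To tie this back to the buffering question: the measured delay variation also suggests roughly how deep a playout buffer has to be to hide the jitter behind a constant delay. A minimal sketch with hypothetical numbers:

    # Hypothetical one-way delays (ms) measured for packets of a stream.
    delays_ms = [105, 108, 101, 140, 103, 132, 106, 102]

    base = min(delays_ms)                          # best-case (least-delayed) packet
    peak_variation = max(d - base for d in delays_ms)

    # Holding every packet until `base + peak_variation` ms after it was sent
    # turns the variable delay into a slightly larger constant one.
    print("minimum delay       :", base, "ms")             # 101 ms
    print("peak delay variation:", peak_variation, "ms")   # 39 ms -> buffer ~39 ms deep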

Answered by Jovi Dsilva on Oct 04 '22