
What is scheduling jitter?

I've been reading a paper on real-time systems using the Linux OS, and the term "scheduling jitter" is used repeatedly without definition.

What is scheduling jitter? What does it mean?

asked Aug 26 '09 by J. Polfer

3 Answers

Jitter is the variation between successive periods of a given task. In a real-time OS it is important to keep jitter at a level acceptable for the application. Here is a picture of jitter:

[Image: diagram of jitter between successive task periods]
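As a rough illustration (not part of the original answer), the sketch below assumes a POSIX system and measures this kind of period-to-period jitter: it times each iteration of a periodic loop and reports the worst deviation from the nominal period. The 2.5 ms period and the iteration count are made-up values for the example.

```c
/* Minimal sketch: measure worst-case period jitter of a periodic loop.
 * Assumes POSIX clock_gettime/nanosleep; build with e.g. gcc -O2 jitter.c */
#include <stdio.h>
#include <time.h>

#define NOMINAL_PERIOD_NS 2500000L   /* hypothetical 2.5 ms period */
#define ITERATIONS        1000

static long timespec_diff_ns(const struct timespec *a, const struct timespec *b)
{
    return (a->tv_sec - b->tv_sec) * 1000000000L + (a->tv_nsec - b->tv_nsec);
}

int main(void)
{
    struct timespec prev, now;
    long max_jitter_ns = 0;

    clock_gettime(CLOCK_MONOTONIC, &prev);
    for (int i = 0; i < ITERATIONS; i++) {
        /* ... periodic work would go here ... */
        struct timespec delay = { 0, NOMINAL_PERIOD_NS };
        nanosleep(&delay, NULL);                 /* sleep one nominal period */

        clock_gettime(CLOCK_MONOTONIC, &now);
        long period_ns = timespec_diff_ns(&now, &prev);
        long jitter_ns = period_ns - NOMINAL_PERIOD_NS;
        if (jitter_ns < 0)
            jitter_ns = -jitter_ns;
        if (jitter_ns > max_jitter_ns)
            max_jitter_ns = jitter_ns;           /* track worst deviation */
        prev = now;
    }

    printf("worst-case jitter: %ld ns\n", max_jitter_ns);
    return 0;
}
```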

answered Nov 07 '22 by Steven Eckhoff


Jitter is the irregularity of a time-based signal. For example, in networks, jitter is the variability of packet latency across the network. In scheduling, I'm assuming jitter refers to unevenness in the slices of time allocated to processes.

Read more here: http://en.wikipedia.org/wiki/Jitter
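For the network case, a minimal sketch of one simple way to quantify it, assuming jitter is taken as the average absolute difference between consecutive latency samples (the sample values below are invented for illustration):

```c
/* Minimal sketch: average packet jitter from a list of latency samples. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    /* hypothetical one-way latency samples in milliseconds */
    double latency_ms[] = { 20.1, 19.8, 22.4, 20.0, 25.3, 19.9 };
    int n = sizeof latency_ms / sizeof latency_ms[0];
    double sum = 0.0;

    for (int i = 1; i < n; i++)
        sum += fabs(latency_ms[i] - latency_ms[i - 1]);

    printf("mean packet jitter: %.2f ms\n", sum / (n - 1));
    return 0;
}
```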

answered Nov 07 '22 by djc


Scheduling jitter is the maximum expected variation in a program's execution period.

This concept is very important in real-time simulation systems. My experience comes from over 30 years in the real-time simulation industry (mostly Flight Simulation). Ideally, no jitter at all is desirable, and that is precisely the objective of hard real-time scheduling.

Suppose, for example, that a real-time simulation needs to execute a certain computer program at 400 Hz in order to produce a stable and accurate simulation of that subsystem. That means we expect the system to execute the program once every 2.5 msec. To achieve that in a hard real-time system, high-resolution clocks are used to schedule that module at a high priority so that the jitter is nearly zero. If this were a soft real-time simulation, a higher amount of jitter would be expected. If the scheduling jitter were 0.1 msec, then the starting point for that program would be every 2.5 msec +/- 0.1 msec (or less). That would be acceptable as long as the program never takes longer than 2.3 msec to execute (2.5 msec minus 0.1 msec of jitter at each end of the frame). Otherwise the program could "overrun". If that ever happens, determinism is lost and the simulation loses fidelity.
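A rough sketch of how such a 400 Hz loop might be scheduled on Linux, assuming the POSIX clock_nanosleep API with absolute deadlines; this is illustrative only, not the simulator's actual code, and a real system would also pin the thread and run it under a real-time policy such as SCHED_FIFO:

```c
/* Minimal sketch: a 400 Hz (2.5 ms) loop using absolute deadlines, with a
 * crude overrun check. Assumes Linux/POSIX clock_nanosleep(TIMER_ABSTIME). */
#define _GNU_SOURCE
#include <stdio.h>
#include <time.h>

#define PERIOD_NS 2500000L   /* 2.5 ms -> 400 Hz */

static void advance(struct timespec *t, long ns)
{
    t->tv_nsec += ns;
    while (t->tv_nsec >= 1000000000L) {
        t->tv_nsec -= 1000000000L;
        t->tv_sec += 1;
    }
}

static long diff_ns(const struct timespec *a, const struct timespec *b)
{
    return (a->tv_sec - b->tv_sec) * 1000000000L + (a->tv_nsec - b->tv_nsec);
}

int main(void)
{
    struct timespec next, now;

    clock_gettime(CLOCK_MONOTONIC, &next);
    for (int cycle = 0; cycle < 4000; cycle++) {   /* ~10 s at 400 Hz */
        advance(&next, PERIOD_NS);
        /* sleep until the absolute deadline of this frame */
        clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);

        /* ... run the 400 Hz simulation module here ... */

        /* if the work finishes after the *next* deadline, we overran */
        clock_gettime(CLOCK_MONOTONIC, &now);
        if (diff_ns(&now, &next) > PERIOD_NS)
            fprintf(stderr, "overrun in cycle %d\n", cycle);
    }
    return 0;
}
```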

answered Nov 07 '22 by Richard Givis