Jitter Definition
Jitter is any deviation in, or displacement of, the signal pulses in a high-frequency digital signal. The deviation can be in terms of amplitude, phase timing or the width of the signal pulse. Among the causes of jitter are electromagnetic interference (EMI) and crosstalk with other signals. Jitter can cause a display monitor to flicker, affect the ability of the processor in a desktop or server to perform as intended, introduce clicks or other undesired effects in audio signals, and cause loss of transmitted data between network devices. The amount of allowable jitter is highly dependent on the application.
Jitter in IP networks is the variation in the latency on a packet flow between two systems, which occurs when some packets take longer to travel from one system to the other than others. Jitter results from network congestion, timing drift and route changes.
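To make the definition concrete, jitter on a flow can be estimated from per-packet send and receive timestamps. The sketch below uses the smoothed interarrival-jitter formula from RFC 3550 (the RTP specification); the function name and timestamp values are illustrative, not taken from any particular tool.

```python
def interarrival_jitter(send_times, recv_times):
    """Return a running jitter estimate in seconds, per RFC 3550.

    Each packet's transit time (receive minus send) is compared with the
    previous packet's; the difference is smoothed with a gain of 1/16.
    """
    jitter = 0.0
    for i in range(1, len(send_times)):
        transit_prev = recv_times[i - 1] - send_times[i - 1]
        transit_curr = recv_times[i] - send_times[i]
        d = abs(transit_curr - transit_prev)
        jitter += (d - jitter) / 16.0  # exponential smoothing per RFC 3550
    return jitter

# Packets sent every 20 ms but received with variable delay (hypothetical data).
send = [0.000, 0.020, 0.040, 0.060, 0.080]
recv = [0.050, 0.072, 0.089, 0.115, 0.130]
print(f"jitter estimate: {interarrival_jitter(send, recv) * 1000:.2f} ms")
```

If every packet took exactly the same time to arrive, the estimate would stay at zero; the more the transit times spread out, the higher it climbs.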
Jitter is especially problematic in real-time communications like IP telephony and video conferencing. It is also a serious problem for hosted desktops and virtual desktop infrastructure (VDI). Jitter can lead to audio and video artifacts (unintended deviation or inconsistency) that degrade the quality of communications.
A jitter buffer (or de-jitter buffer) can mitigate the effects of jitter, either in the network on a router or switch, or on a computer. The application consuming the network packets receives them from the buffer rather than directly from the network. Packets are fed out of the buffer at a regular rate, smoothing out the variations in timing of packets flowing into the buffer.
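A minimal sketch of how such a buffer might behave, assuming packets carry sequence numbers and the consumer drains the buffer on a fixed playout tick (the class name, depth parameter and underrun behavior here are illustrative, not taken from any particular implementation):

```python
import heapq

class JitterBuffer:
    """Accumulates packets, then releases them in order at a steady pace."""

    def __init__(self, depth=3):
        self.depth = depth     # packets to accumulate before playout starts
        self.heap = []         # min-heap ordered by sequence number
        self.started = False

    def push(self, seq, payload):
        """Accept a packet whenever it arrives, even out of order."""
        heapq.heappush(self.heap, (seq, payload))

    def pop(self):
        """Called once per playout tick; returns a packet or None."""
        if not self.started:
            if len(self.heap) < self.depth:
                return None    # still filling: absorb early arrival variation
            self.started = True
        if self.heap:
            return heapq.heappop(self.heap)
        return None            # underrun: arrivals fell behind playout

buf = JitterBuffer(depth=2)
buf.push(2, "B"); buf.push(1, "A")   # arrived out of order
print(buf.pop(), buf.pop())          # -> (1, 'A') (2, 'B')
```

Holding a few packets before playout begins trades a small amount of added latency for tolerance of arrival-time variation, which is why buffer depth is typically kept small in real-time applications.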
Where multiple pathways for traffic are available, other techniques for mitigating jitter are to selectively route traffic along the most stable paths, or to always pick the path that comes closest to the targeted packet delivery rate.
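Both policies amount to choosing among paths based on measured statistics. A hedged illustration of the two selection rules, assuming per-path jitter and delivery-rate figures have already been collected (the path names and numbers are hypothetical):

```python
paths = {
    "path_a": {"jitter_ms": 4.2, "delivery_rate": 0.990},
    "path_b": {"jitter_ms": 1.1, "delivery_rate": 0.972},
    "path_c": {"jitter_ms": 9.8, "delivery_rate": 0.999},
}

# Policy 1: route along the most stable (lowest-jitter) path.
most_stable = min(paths, key=lambda p: paths[p]["jitter_ms"])

# Policy 2: pick the path closest to a targeted packet delivery rate.
TARGET_RATE = 0.995
closest = min(paths, key=lambda p: abs(paths[p]["delivery_rate"] - TARGET_RATE))

print(most_stable, closest)  # -> path_b path_c
```

Note that the two policies can disagree, as they do here: the lowest-jitter path is not necessarily the one that best meets a delivery-rate target, so the choice depends on what the application values most.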