What is Jitter?


In telecommunication transmission systems, ensuring Quality of Service (QoS) is one of the main challenges. QoS here refers to a set of technologies used to manage data traffic, maximize usable bandwidth, and reduce network performance problems such as packet loss, error rate, and jitter. Broadly speaking, jitter arises when a seemingly periodic signal deviates from true periodicity relative to a reference clock signal. In networking terms, it is the variation in latency: the small, intermittent delays that occur during data transfer.

There is almost always some latency when signals are transmitted. When a message is sent, its packets are routed individually and received in a queue at the destination device. Unfortunately, a constant delay is difficult to guarantee, which gives rise to jitter, or delay inconsistency. In real-time applications such as streaming, online gaming, and digital voice communication, this can be problematic. Voice traffic in a VoIP network is a good example: at the origin, packets are sent at regular intervals, say one frame every 10 milliseconds. If some packets are delayed more than others, or follow different routes, they arrive at the destination unevenly or out of order.
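One common way to quantify this delay variation is the smoothed interarrival jitter estimate defined for RTP in RFC 3550. The sketch below implements that running-average formula; the timestamps are illustrative, not from a real capture.

```python
# Sketch: estimating interarrival jitter for packets sent every 10 ms,
# following the RFC 3550 (RTP) running-average formula. Timestamps are
# illustrative assumptions, not measured values.

def interarrival_jitter(send_times_ms, recv_times_ms):
    """Return the RFC 3550 smoothed jitter estimate in milliseconds."""
    jitter = 0.0
    prev_transit = None
    for sent, received in zip(send_times_ms, recv_times_ms):
        transit = received - sent            # one-way transit time
        if prev_transit is not None:
            d = abs(transit - prev_transit)  # change in transit time
            jitter += (d - jitter) / 16.0    # exponential smoothing, gain 1/16
        prev_transit = transit
    return jitter

# Packets sent every 10 ms; arrivals vary because of queuing delays.
send_times = [0, 10, 20, 30, 40]
recv_times = [5, 16, 24, 37, 46]  # transit times: 5, 6, 4, 7, 6 ms
print(round(interarrival_jitter(send_times, recv_times), 3))  # → 0.4
```

With a perfectly constant transit time, every `d` would be zero and the estimate would stay at 0; any variation in delay pushes it up.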

Why does Jitter occur?

Jitter occurs for several reasons, the two main ones being route variation and congestion. Congestion causes queuing at routers, which delays packets; as a result, messages sent at a regular rate arrive at uneven intervals. Each packet experiences its own delay while traversing the network, so the network latency is an average of these individual delays, and jitter is the variation around that average. Moreover, since there is no guarantee that all packets follow the same route to a destination, packets may arrive out of order.

Jitter is either random or deterministic; the former is unbounded and the latter bounded. Because random jitter is unbounded, the peak-to-peak jitter grows indefinitely the longer one observes it, making it very difficult to predict. Deterministic jitter, on the flip side, is bounded between absolute maximum and minimum values, and is therefore easier to predict and reproduce. The combination of random and deterministic jitter is known as total jitter. As for measurement, jitter is categorized as period jitter, phase jitter, long-term jitter, time interval error, and cycle-to-cycle jitter.
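Two of these measurement categories are easy to illustrate. Given a list of signal edge timestamps, period jitter is the deviation of each measured period from the nominal period, while cycle-to-cycle jitter is the difference between consecutive periods. A minimal sketch, using illustrative edge times for a nominal 10 ns clock:

```python
# Sketch: two common jitter measurements computed from edge timestamps.
# The edge times below are illustrative assumptions, not real measurements.

def period_jitter(edges, nominal):
    """Deviation of each measured period from the nominal period."""
    periods = [b - a for a, b in zip(edges, edges[1:])]
    return [p - nominal for p in periods]

def cycle_to_cycle_jitter(edges):
    """Difference between each pair of consecutive periods."""
    periods = [b - a for a, b in zip(edges, edges[1:])]
    return [q - p for p, q in zip(periods, periods[1:])]

edges_ns = [0.0, 10.1, 19.9, 30.2, 40.0]  # rising-edge times in ns
pj = period_jitter(edges_ns, 10.0)        # roughly 0.1, -0.2, 0.3, -0.2 ns
c2c = cycle_to_cycle_jitter(edges_ns)     # roughly -0.3, 0.5, -0.5 ns
```

An ideal clock would yield all zeros in both lists; the spread of these values is what the bounded/unbounded distinction above describes.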

As mentioned earlier, jitter has considerable impact. First, it can cause packet loss. When packets arrive irregularly, the receiving system tries to compensate and reorder them; if it cannot process all the incoming packets, some are dropped. This is especially harmful in real-time services. Jitter also contributes to network congestion, which occurs when the traffic arriving at a network device exceeds what it can forward. As a result, some packets may be lost, while others are delayed on their way to the receiver. As a rule of thumb, when jitter exceeds 15-20 milliseconds, latency rises, packet loss increases, and the quality of the received signal degrades.

Jitter is commonly alleviated with buffers, of which there are two types: static and dynamic. Static buffers, as the name implies, are built into a system's hardware and are usually configured by the manufacturer. Dynamic buffers, on the other hand, are typically implemented in the system's software and are configured by a network administrator.
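The idea behind such a buffer can be sketched as a simple playout schedule: packets that arrive at uneven times are held briefly and released at a fixed interval, trading a small added delay for a steady stream. The function name, buffer delay, and arrival times below are all illustrative assumptions:

```python
# Sketch of a playout (jitter) buffer: hold arriving packets and release
# them at fixed intervals. Parameter values are illustrative assumptions.

def playout_schedule(arrivals_ms, buffer_delay_ms=30, interval_ms=10):
    """Return ([(sequence_number, playout_time), ...], [late_sequence_numbers]).

    arrivals_ms maps sequence number -> arrival time in ms. Packet seq
    plays at base + seq * interval, where base = first arrival + buffer
    delay; a packet that arrives after its slot is counted as late.
    """
    base = min(arrivals_ms.values()) + buffer_delay_ms
    played, late = [], []
    for seq in sorted(arrivals_ms):
        slot = base + seq * interval_ms
        if arrivals_ms[seq] <= slot:
            played.append((seq, slot))   # released on a steady 10 ms grid
        else:
            late.append(seq)             # missed its slot; effectively lost
    return played, late

# Packets 0-2 arrive with modest jitter; packet 3 is badly delayed.
played, late = playout_schedule({0: 0, 1: 14, 2: 18, 3: 75})
```

A larger `buffer_delay_ms` absorbs more jitter but adds end-to-end latency, which is exactly the trade-off a dynamic buffer tunes at runtime.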

If your current system frequently experiences interruption and disruption to services, contact us to upgrade to a seamless and rock-solid solution!
