VoIP solutions and video chat apps are part of our business and personal lives – and online multiplayer games have become mainstream hobbies. To enable these applications, huge volumes of data are exchanged online at ever-increasing rates. A delay in the transmission of data – i.e. latency – can have an enormous impact on user experience.
To use a road transport analogy: putting more trucks on the road results in higher throughput, but has no effect on travel time. Even a little latency is noticeable – and irritating – which is why it’s key to keep it to a minimum.
Network latency describes the delay in transmitting data packets from source to destination through a network. Bandwidth, by contrast, refers to a network’s maximum data transfer rate over a given period of time. While the effects of latency and bandwidth bottlenecks can look similar and even magnify each other, their causes are different.
Network latency in action: obvious from 200 milliseconds
To demonstrate the effects of latency, let’s load our website www.excentis.com while artificially adding latency to the network. The top image shows the result with 10 milliseconds of latency, the middle image with 60 milliseconds and the bottom image with 510 milliseconds.
With only 60 milliseconds of latency, the site already loads noticeably slower than with 10 milliseconds. When the loading time reaches 2 seconds or more, users will very likely abandon the session. Repeating the same demonstration by limiting bandwidth instead makes no noticeable difference. Loading a map in Google Maps, for example, takes significantly more time on a high-latency network than on a low-latency one, even though the data throughput is the same.
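A demonstration like this can be reproduced on a Linux host using the netem queueing discipline from the iproute2 suite; the sketch below assumes root privileges and an interface named eth0 (adjust to your setup).

```shell
# Add 60 ms of fixed delay to all traffic leaving eth0.
tc qdisc add dev eth0 root netem delay 60ms

# Remove the artificial delay again when the experiment is done.
tc qdisc del dev eth0 root netem
```

Note that netem also supports variable delay (e.g. `delay 60ms 10ms`), which is useful for simulating jitter rather than a fixed latency.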
Browsers deploy a few tricks to minimize this impact. For example, they fetch content simultaneously: if the site you’re visiting contains many or very large images, they can all be downloaded in parallel. Modern browsers also maintain persistent connections with high-data-volume websites (the persistent connection model), or send multiple requests in succession without waiting for each response (HTTP pipelining). These approaches work well for browsers, but they can’t always be applied in other applications.
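The benefit of fetching in parallel can be illustrated with a small simulation. The sketch below (a toy model, not real HTTP) assumes each request costs one 60 ms round trip: sequential requests pay that cost once per resource, while concurrent requests pay it roughly once in total.

```python
import time
from concurrent.futures import ThreadPoolExecutor

RTT = 0.06  # assumed 60 ms round-trip time per request (simulated)

def fetch(resource):
    """Simulate one HTTP request whose cost is dominated by latency."""
    time.sleep(RTT)
    return resource

resources = [f"image_{i}.png" for i in range(6)]

# Sequential: each request waits for the previous one, ~6 x 60 ms total.
start = time.monotonic()
for r in resources:
    fetch(r)
sequential = time.monotonic() - start

# Concurrent: all requests are in flight at once, ~1 x 60 ms total.
start = time.monotonic()
with ThreadPoolExecutor(max_workers=len(resources)) as pool:
    list(pool.map(fetch, resources))
concurrent = time.monotonic() - start

print(f"sequential: {sequential:.2f}s, concurrent: {concurrent:.2f}s")
```

The same data is transferred in both cases; only the number of latency round trips paid in series differs.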
What causes latency
- An electrical signal cannot travel faster than the speed of light, and even optical signals in fiber move at only about two-thirds of their speed in vacuum. This effect is barely noticeable within a LAN because of the short distances between devices, but when transferring data across the world, physical propagation delay plays a very important role.
- Each device between server and client adds some delay. The internet consists of billions of routers, switches, CMTSes, cable modems and other devices. Every one of them, even your own network card, contributes to the total latency: buffering and processing speed determine how fast packets can be forwarded.
- Unresponsive servers also have a huge effect on the user experience: an overloaded server responds much more slowly.
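The first cause above sets a hard lower bound that no hardware upgrade can beat. A quick back-of-the-envelope calculation, assuming light in fiber travels at roughly 200,000 km/s and using illustrative great-circle distances:

```python
# Speed of light in fiber: roughly two-thirds of its vacuum speed.
SPEED_IN_FIBER_KM_S = 200_000

def propagation_delay_ms(distance_km: float) -> float:
    """One-way propagation delay over a straight fiber run, in ms."""
    return distance_km / SPEED_IN_FIBER_KM_S * 1000

# Illustrative distances (assumed, approximate great-circle values):
for route, km in [("within a LAN", 0.1),
                  ("Brussels -> New York", 5_900),
                  ("Brussels -> Sydney", 16_600)]:
    one_way = propagation_delay_ms(km)
    print(f"{route}: {one_way:.2f} ms one-way, "
          f"{2 * one_way:.2f} ms round trip")
```

Real routes are longer than the great-circle distance and add per-hop device delays on top, so measured round-trip times are always higher than this physical floor.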
Some key definitions: jitter and lag
There are some important terms to be aware of when monitoring and assessing network latency. Jitter, for example, is the variation of latency over time across a network; a network with constant latency has zero jitter. Applications that require uninterrupted playout need buffers to compensate for the jitter the network introduces. Note that these buffers increase the latency experienced by the user, which for real-time applications puts a limit on the maximum buffer size that can be used.
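Given a series of latency measurements, a simple jitter estimate is the mean absolute difference between consecutive samples (similar in spirit to RFC 3550's interarrival jitter, but without its smoothing). A minimal sketch with made-up sample data:

```python
from statistics import mean

def jitter_ms(latencies_ms):
    """Mean absolute difference between consecutive latency samples."""
    return mean(abs(b - a) for a, b in zip(latencies_ms, latencies_ms[1:]))

steady = [40.0, 40.0, 40.0, 40.0]  # constant latency -> zero jitter
bursty = [40.0, 55.0, 38.0, 62.0]  # variable latency -> high jitter

print(jitter_ms(steady))  # -> 0.0
print(jitter_ms(bursty))
```

A playout buffer then has to be at least large enough to absorb the observed variation, which is exactly why jitter indirectly increases the latency the user experiences.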
Latency in gaming, often called lag, describes a delay in the perception of game events. A player with a high-latency internet connection may appear to respond slowly despite a quick reaction time. As a result, players with low-latency connections have a tactical advantage, as they are able to respond in-game closer to real time. In gaming, lag is detectable from as little as 30 milliseconds.
Don’t lag behind on monitoring latency
Since latency affects network users in a variety of different ways, it’s key for operators to measure and monitor latency. We’ll explain how to do that in a second blog on latency.
Have questions about latency testing, or need help achieving your network goals? Get in touch with us for assistance.