What is Network Latency?
In cloud networking, latency refers to the time it takes for a packet of data to travel or be routed across a network. Ideally, this delay is as close to zero as possible. Low-latency networks have short delays, while high-latency networks have long delays. High latency creates bottlenecks and causes unexpected delays or lag.
All networks have some degree of network latency. The amount of latency varies and can be temporary or persistent depending on what is causing the delays.
Impact of Routers and Distance on Cloud Network Latency
Latency typically originates from two sources: routers and distance.
Routers copy packets as they travel between network interfaces, which introduces a delay of only a few milliseconds per hop. Latency accumulates when packets must travel through multiple routers. Additionally, when networks and routers are near capacity, a router must queue packets before forwarding them to a network interface. Although each of these delays is only milliseconds, together they add up to noticeable network latency.
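The accumulation described above can be sketched with a few lines of arithmetic. This is a minimal illustration, not a network model; the per-hop and queuing delays are assumed example values.

```python
# Illustration: total path latency is the sum of small per-hop delays
# (processing plus any queuing when routers are near capacity).
# All delay values below are assumed examples, not measurements.

def total_path_latency_ms(per_hop_ms, queuing_ms_per_hop=0.0):
    """Sum per-hop processing delays plus an optional queuing delay per hop."""
    return sum(d + queuing_ms_per_hop for d in per_hop_ms)

# A lightly loaded 10-hop path with ~2 ms of processing per router:
light = total_path_latency_ms([2.0] * 10)            # 20 ms
# The same path when each congested router also queues packets for ~5 ms:
congested = total_path_latency_ms([2.0] * 10, 5.0)   # 70 ms
```

Even modest queuing at each hop more than triples the end-to-end delay in this example, which is why congestion along a multi-router path is felt so strongly.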
Distance also contributes to latency. Packets move across networks at about 100,000 miles per second. (For comparison, the speed of light in a vacuum is 186,282 miles per second). Despite this speed, a packet traveling around the earth would have at least 250 milliseconds of latency. Add to this the reality that network paths are rarely direct, which increases the distance that packets must travel.
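The distance figures above follow from simple division: one-way propagation delay is distance divided by signal speed. A short sketch, using the approximate speed and Earth circumference from the text:

```python
# Propagation-delay arithmetic from the text: packets travel at roughly
# 100,000 miles per second, so distance / speed gives the one-way delay.

SIGNAL_SPEED_MPS = 100_000          # miles per second (approx., from the text)
EARTH_CIRCUMFERENCE_MILES = 24_901  # approximate equatorial circumference

def propagation_delay_ms(distance_miles, speed=SIGNAL_SPEED_MPS):
    """One-way propagation delay in milliseconds over a given distance."""
    return distance_miles / speed * 1000

# Around the earth: roughly 250 ms, before indirect routing adds more distance.
round_the_world = propagation_delay_ms(EARTH_CIRCUMFERENCE_MILES)
```

Because this delay is set by physics, it is a hard floor: no amount of hardware upgrades can push latency below the propagation time for the path's distance.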
Other Causes of Network Latency
- Transmission media have different speeds – e.g., wireless links have more latency than optical fiber.
- Packet size impacts latency – i.e., larger packets take longer to transmit and receive than small ones.
- Signals that must be boosted by a repeater increase latency.
- Intermediate devices, such as switches and bridges, can add latency when packets must be stored or accessed before being forwarded.
- Security systems, such as anti-virus software, add latency when messages must be taken apart, inspected, and reassembled before sending.
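The packet-size point in the list above comes down to serialization delay: the time needed to place a packet's bits on the wire grows linearly with its size. A minimal sketch, assuming example packet sizes and an example 10 Mbit/s link speed:

```python
# Illustration of serialization delay: larger packets take proportionally
# longer to transmit. Link speed and packet sizes are assumed example values.

def serialization_delay_ms(packet_bytes, link_bps):
    """Time in milliseconds to transmit a packet on a link of a given bitrate."""
    return packet_bytes * 8 / link_bps * 1000

# On a 10 Mbit/s link, a 1500-byte packet takes 1.2 ms to serialize,
# while a 64-byte packet takes about 0.05 ms.
large = serialization_delay_ms(1500, 10_000_000)
small = serialization_delay_ms(64, 10_000_000)
```

Serialization delay is one reason small packets are preferred for latency-sensitive traffic such as voice, even though larger packets use the link more efficiently.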