Internet Service Providers, latency, customers, and how it all fits together

I have noticed a trend regarding customers and latency in the Internet Service Provider (ISP) world. Those in the industry know all about customers and speed tests. Love or hate them, the speed test is a way of life for the ISP. I wrote an article a while back on the Problems with Speed Tests. As customers become more educated, they are now paying attention to latency, and we have many of the same issues with latency as we do with speed tests.

First, let’s define what latency is. Latency is the time it takes a packet to reach its destination. While a speed test measures how much data can be delivered in a given time, latency measures how long each piece of data takes to arrive. To put this in car terms: latency is the time it takes to run the quarter mile, while a speed test measures how fast your car was going over that quarter mile. The folks over at have a great article on latency vs. throughput.

So, why is latency significant in networks? Latency affects how long a DNS lookup takes. Latency affects how long each network packet takes to be delivered. The TCP protocol has mechanisms built in that slow the rate at which packets are sent and received based on latency, but that is a whole different discussion.
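One common way to observe latency directly is to time a TCP handshake. The sketch below spins up a throwaway listener on localhost so it is self-contained; the host and port are illustrative, and you would point them at a real server to probe an actual link.

```python
import socket
import time

def measure_connect_latency(host, port):
    """Time a full TCP handshake, a rough but practical latency probe."""
    start = time.perf_counter()
    with socket.create_connection((host, port), timeout=5):
        pass  # handshake complete; close the connection immediately
    return (time.perf_counter() - start) * 1000  # milliseconds

# Throwaway local listener so the example runs anywhere; the kernel
# completes the handshake from the listen backlog without an accept().
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))  # OS picks a free port
server.listen(1)
port = server.getsockname()[1]

rtt_ms = measure_connect_latency("127.0.0.1", port)
print(f"TCP handshake latency: {rtt_ms:.3f} ms")
server.close()
```

On localhost this will come back in a fraction of a millisecond; against a server across the country you would see the distance show up in the number.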

Voice over IP (VoIP) is one service where latency is very noticeable. The higher the latency on the connection, the more delay there is in real-time voice. Anything real-time is highly dependent on the latency of the connection.

Another excellent analogy for latency is the water pipes in your home, especially the hot water. With a traditional tank water heater, there is a delay between turning on the faucet and the hot water arriving, because the water has to travel from the water heater to the faucet. That delay is latency. The amount of water you get is the same; it just takes longer for the hot water to reach you.

So what affects latency? Physics is the number one cause. Let me explain. The further you are from something, the longer it takes to reach it. In the example above, the longer the pipe run from the water heater to the faucet, the more time it takes the hot water to arrive. The same holds true for the internet. A person in Ohio requesting a website hosted in New York will see better latency to that website than someone in Texas.
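The distance effect can be put in rough numbers. This is a minimal sketch, assuming light travels through fiber at roughly 200,000 km/s and using approximate great-circle distances (Columbus to New York, Dallas to New York) purely for illustration:

```python
# One-way propagation delay is just distance / signal speed.
SPEED_IN_FIBER_KM_S = 200_000  # light in glass fiber, roughly 2/3 of c

def propagation_delay_ms(distance_km):
    """Best-case one-way delay over fiber, ignoring routing and queueing."""
    return distance_km / SPEED_IN_FIBER_KM_S * 1000

ohio_to_ny_km = 770    # Columbus -> New York, approximate
texas_to_ny_km = 2200  # Dallas -> New York, approximate

print(f"Ohio  -> NY: {propagation_delay_ms(ohio_to_ny_km):.2f} ms one way")
print(f"Texas -> NY: {propagation_delay_ms(texas_to_ny_km):.2f} ms one way")
```

Real paths are longer than the straight-line distance and add router hops, so measured latency will always be higher than this floor; the point is that the floor itself is set by physics.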

The type of connection can also have a bearing on latency. Wireless can introduce latency due to, you guessed it, physics. That same wireless can also beat even a fiber optic connection. Some high-frequency traders use line-of-sight wireless links to gain milliseconds and even microseconds over fiber, because light travels more slowly inside glass fiber than radio waves travel through the air, and fiber routes often cover more distance.

Latency is also influenced by load. An internet connection running at 100% utilization will introduce latency as traffic queues up waiting its turn. As mentioned above, the TCP protocol used on the internet slows a connection down to keep it reliable, and one side effect is longer response times, and thus higher latency.
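Why does latency climb so sharply as a link fills? A minimal sketch using the classic M/M/1 queueing approximation (average delay grows as 1/(1 − utilization)) shows the shape; the 1 ms base service time is an illustrative number, not a measurement:

```python
# M/M/1 approximation: average delay = service_time / (1 - utilization).
# As utilization approaches 100%, queueing delay grows without bound.
def queueing_delay_ms(base_service_ms, utilization):
    if utilization >= 1.0:
        return float("inf")  # saturated link: the queue never drains
    return base_service_ms / (1.0 - utilization)

for load in (0.5, 0.8, 0.95, 0.99):
    print(f"{load:>5.0%} load -> {queueing_delay_ms(1.0, load):6.1f} ms average delay")
```

Note that delay roughly doubles at 50% load but is one hundred times the base at 99% load. This is why a connection can feel fine at moderate use and fall apart near saturation.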

In the next article, I will explain some tools the ISP can use to deal with reports of high latency.

j2networks family of sites
#packetsdownrange #routethelight