Network Performance Measurements: Bandwidth, Throughput & Latency


There are a number of terms used to refer to various aspects of network performance, and some of them may already be familiar to you. You will often see these terms in networking books and blogs. Before going deeper into the networking world, we should properly understand what they mean. In this blog, we will look at network performance measurements: speed, bandwidth, throughput, and latency.

Read about Circuit-switching and Packet-switching networks.

Speed

This is the most generic term used when measuring the performance of a network. Most commonly, the speed of a network refers to the rated or nominal speed of a particular networking technology.

Example: Fast Ethernet has a rated speed of 100 Mbps, also called 100 Mbps Ethernet (100BASE-TX).

One thing to keep in mind is that no networking technology runs at its full rated speed, and many run substantially below it, mainly because of real-world performance factors.

In the case of Fast Ethernet, the rated speed of 100 Mbps is also sometimes referred to as the throughput of the technology.

Bandwidth

Bandwidth refers to the data-carrying capacity of a network or data-transmission medium.

This is a widely used term in the networking field, indicating the maximum amount of data that can pass from one point to another in a unit of time.

Throughput

Earlier we saw that the bandwidth of a networking medium is the maximum amount of data that can be transferred across it. Throughput is a measure of how much data is actually sent per unit of time across a network, channel, or interface.

Both bandwidth and throughput describe data-transfer rates, but throughput is more often used in a practical sense.

Example: Suppose we are using a Fast Ethernet link with a rated speed of 100 Mbps. The upper limit of the medium, i.e. the rated speed, is commonly called the bandwidth of the link, while the rate at which it actually carries data in the real world, say 80.2 Mbps, is its throughput.
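To make the difference concrete, here is a small worked calculation (the file size is just an assumed figure): transferring a 1 GB file (about 8,000 megabits) would take 80 seconds at the full 100 Mbps bandwidth, but roughly 100 seconds (8,000 ÷ 80.2 ≈ 99.8) at the observed throughput of 80.2 Mbps.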

The terms bandwidth and throughput are often used interchangeably, but they are not exactly the same. Bandwidth is more theoretical, while throughput describes what you actually observe in the real world.
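Below is a minimal sketch of how you might measure real-world throughput yourself: download a file and divide the bytes received by the elapsed time. The URL here is only a placeholder assumption; substitute any large test file you are allowed to fetch.

```python
import time
import urllib.request

TEST_URL = "http://example.com/testfile.bin"  # placeholder URL, not a real test server

def measure_throughput(url: str, chunk_size: int = 64 * 1024) -> float:
    """Download the file at `url` and return observed throughput in Mbps."""
    total_bytes = 0
    start = time.monotonic()
    with urllib.request.urlopen(url) as response:
        while True:
            chunk = response.read(chunk_size)
            if not chunk:
                break
            total_bytes += len(chunk)
    elapsed = time.monotonic() - start
    # Convert bytes per second to megabits per second.
    return (total_bytes * 8) / (elapsed * 1_000_000)

if __name__ == "__main__":
    print(f"Observed throughput: {measure_throughput(TEST_URL):.1f} Mbps")
```

On a 100 Mbps link, a measurement like this will typically report something noticeably below 100 Mbps, which is exactly the gap between bandwidth and throughput described above.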

Latency

Latency is a measure of the time delay experienced by data as it travels over a communication channel, medium, or network.

In simple words, Latency is how long it takes from the time a request for data is made until it starts to arrive.

Low Latency in a network is considered better than high latency.
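One simple way to get a feel for latency is to time how long a TCP connection takes to complete, which corresponds roughly to one network round trip. The sketch below assumes a reachable host and port (the values shown are placeholders).

```python
import socket
import time

HOST, PORT = "example.com", 80  # placeholder host and port

def measure_latency(host: str, port: int) -> float:
    """Return the time to establish a TCP connection, in milliseconds."""
    start = time.monotonic()
    with socket.create_connection((host, port), timeout=5):
        pass  # we only care about how long the handshake took
    return (time.monotonic() - start) * 1000

if __name__ == "__main__":
    samples = [measure_latency(HOST, PORT) for _ in range(5)]
    print("Latency samples (ms):", [round(s, 1) for s in samples])
    print(f"Average latency: {sum(samples) / len(samples):.1f} ms")
```

Taking several samples and averaging them, as above, smooths out momentary spikes and gives a more representative latency figure.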

These terms are used in various places in the networking field. In general, speed, bandwidth, and throughput get a lot of attention, while latency gets little.

But latency gets the most attention in real-world applications such as audio and video streaming and interactive games, and also when trying to decrease the load time of a website.

Learn more about network performance measurement terms from Wikipedia.
