Simple Throughput Formula
To calculate throughput in megabits per second (Mbps), divide 8192 by the time it takes to copy a 1 gigabyte file (2^30 bytes, or 1,073,741,824).
v = 8,192 / t
v – Throughput in Mbps
t – Time, in seconds, that it takes to copy a 1 gigabyte file
This simple throughput test allows you to quickly check actual throughput for storage or network connections, and can be a valuable troubleshooting tool.
A 1 gig file takes 300 seconds to copy.
8,192 / 300 = 27.3 Mbps (Megabits per second)
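As a minimal sketch, the formula can be wrapped in a small Python helper (the function name throughput_mbps is mine, not part of the original):

```python
def throughput_mbps(seconds: float) -> float:
    """Throughput in Mbps for copying a 1 gigabyte (2**30 byte) file."""
    return 8192 / seconds

# The example above: a 1 gig file that takes 300 seconds to copy.
print(round(throughput_mbps(300), 1))  # 27.3
```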
How does this work?
Throughput is the number of bits transferred in a specified amount of time (in seconds).
File size is measured in bytes, while throughput is measured in bits per second. As there are 8 bits in a byte, the first step is to multiply the file size by 8, to get the total number of bits.
If we have a 1 gig file, that’s 1,073,741,824 bytes, or 8,589,934,592 total bits to transfer.
If we divide by the time at this point, we get a very large number representing "bits per second" (bps).
When dealing with network or internet throughput rates, we typically use a much larger scale, such as megabits per second. A kilobit is 1024 bits, and a megabit is 1024 kilobits (or 1024 * 1024 bits). To go from bits to megabits, we divide by 1024 * 1024 ( 1,048,576 ).
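The conversion chain described above, from bytes to bits to megabits, can be checked with a few lines of Python (variable names are mine, for illustration):

```python
file_bytes = 1_073_741_824                  # 1 gigabyte = 2**30 bytes
file_bits = file_bytes * 8                  # 8,589,934,592 bits
file_megabits = file_bits / (1024 * 1024)   # divide by 2**20
print(file_megabits)  # 8192.0
```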
So for a given file size f, taking time t in seconds to copy, the throughput v is as follows:
v = ( f * 8 / ( 1024 * 1024 ) ) / t
v = ( f * 8 ) / ( 1024^2 * t )
With a file size of 1 gigabyte (1024 * 1024 * 1024 bytes), the equation looks like this:
v = ( 1024^3 * 8 ) / ( 1024^2 * t )
Since 1024^3 / 1024^2 = 1024, the above can be simplified to:
v = ( 1024 * 8 ) / t
v = 8,192 / t
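To see that the simplification holds, a quick Python check (sample time t is arbitrary) shows the general and simplified forms agree for a 1 gigabyte file:

```python
f = 1024 ** 3   # 1 gigabyte in bytes
t = 300.0       # sample copy time in seconds

general = (f * 8 / (1024 * 1024)) / t
simplified = 8192 / t
print(general == simplified)  # True for the 1 gigabyte case
```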
To express throughput in Gbps (gigabits per second), simply divide 8 by the time in seconds. Dividing 8,192 megabits by 1,024 gives 8 gigabits, which is just the original 1 gig (abyte) file converted to gigabits: 1 (the file size in gigabytes) times 8 (the number of bits in a byte).
v (Gbps) = 8 / t
The catch is that as the time taken to copy the file approaches 1 second or less, you lose precision.
Say your 1 gig file takes 0.9 seconds. The throughput is 8 / 0.9 = 8.9 Gbps. But measuring 9/10 of one second, which is 900 milliseconds, requires a sensitive and precise timer.
One way around this is to scale everything up – for example, using a 10 gig file means we simply multiply the number of gigabits by 10:
v (Gbps) = 80 / t
Now our 10 gig file might take 9 seconds, a span we can easily measure with a stopwatch, and the math yields the same result: 80 / 9 = 8.9 Gbps.
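The scaling trick can be sketched in Python: a 1 gig file copied in 0.9 seconds and a 10 gig file copied in 9 seconds measure the same link speed (function names are mine, for illustration):

```python
def gbps_1gig(seconds: float) -> float:
    return 8 / seconds       # 1 gigabyte = 8 gigabits

def gbps_10gig(seconds: float) -> float:
    return 80 / seconds      # 10 gigabytes = 80 gigabits

# Same link speed, but the 10 gig copy takes a stopwatch-friendly 9 seconds.
print(abs(gbps_1gig(0.9) - gbps_10gig(9)) < 1e-9)  # True
```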
As storage and networking become faster, larger file sizes are required to make an accurate measurement.
How Do I Create a 1 Gig File?
In Windows, use these commands at a command prompt:
echo ABCDEF>test.txt
for /L %i in (1,1,27) do type test.txt >> test.txt
The first line creates an 8 byte file: the 6 bytes of the string “ABCDEF”, plus a Carriage Return and Line Feed that you don’t see, which are also stored in the file.
The second line counts from 1 to 27, each time appending the file to the bottom of itself, doubling its size.
Since the original file is 2^3 bytes, if we double it, the resulting file is 2^4 bytes. If we double it 27 times, the file size is 2^(3 + 27), or 2^30, which is exactly 1 gig.
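The doubling arithmetic above can be verified with a short Python sketch of the loop's effect (this models the batch loop's size growth, not the batch command itself):

```python
size = 8                 # echo ABCDEF plus CR/LF: 2**3 bytes
for _ in range(27):      # the for /L loop runs 27 times
    size *= 2            # each pass appends the file to itself
print(size == 2 ** 30)   # True: exactly 1 gig
```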