Saturating 1Gbps bandwidth

My ISP recently gave me a free speed bump from 500Mbps to 1Gbps. I wanted to test whether it was possible to fully utilise the bandwidth on a single file transfer.

First, the test file size. I settled on roughly 500MB to 1GB to minimise the effect of TCP window scaling. To use an analogy, window scaling is like two joggers who do not know each other's pace: they both start off with a slow jog, then gradually pick up speed until one of them hits their limit, and they hold that pace for the rest of the distance. If the distance is too short, that slow start drags the average speed down considerably, so the distance has to be long enough. Similarly, the file has to be large enough. In hindsight, I should have settled on a slightly larger file, probably 3GB, since 500MB should theoretically take only 4 seconds to download on a 1Gbps connection.
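To put numbers on that, here is a quick back-of-the-envelope calculation (plain Python, using the decimal convention of 1Gbps = 125MB/s; none of these figures come from the actual tests):

```python
def download_seconds(size_mb: float, link_gbps: float = 1.0) -> float:
    """Theoretical transfer time, ignoring slow start and protocol overhead.

    1 Gbps = 1000 Mbit/s = 125 MB/s (decimal megabytes).
    """
    link_mb_s = link_gbps * 1000 / 8  # MB/s
    return size_mb / link_mb_s

# A 500 MB file is done in 4 s at line rate, so even a second or two of
# slow start badly skews the measured average; 3 GB gives a 24 s window.
for size in (500, 1000, 3000):
    print(f"{size} MB at 1 Gbps: {download_seconds(size):.0f} s")
```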

First, I tested my local network. No problems here: 113MB/s = 0.904Gbps, just shy of 1Gbps.

$ aria2c https://limbenjamin.com/files/ubuntu-14.04.4-server-i386.iso

Download Results:
gid   |stat|avg speed  |path/URI
======+====+===========+=======================================================
xxxxxx|OK  |   113MiB/s|/home/Benjamin/ubuntu-14.04.4-server-i386.iso

Next, speedtest.com.sg: 38MB/s = 0.304Gbps, only about a third of the limit. I am sticking to Singapore-based servers because anything outside would incur huge speed penalties, making it impossible to hit the speed limit.

$ aria2c --split=8 http://www.speedtest.com.sg/test_random_1000mb.zip

Download Results:
gid   |stat|avg speed  |path/URI
======+====+===========+=======================================================
xxxxxx|OK  |    38MiB/s|/home/Benjamin/test_random_1000mb.zip
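A side note on units: aria2c reports speeds in MiB/s, while link speeds are quoted in decimal Gbps. The figures in this post use the simpler approximation MiB/s ≈ MB/s; a small converter (not from the original tests) shows the strict values are marginally higher:

```python
def mib_per_s_to_gbps(mib_s: float) -> float:
    """Convert aria2c's MiB/s readings to line-rate Gbps.

    1 MiB = 2**20 bytes; 1 Gbps = 10**9 bits/s.
    """
    return mib_s * 2**20 * 8 / 1e9

# The in-text figures use mib_s * 8 / 1000, which slightly understates:
print(f"{mib_per_s_to_gbps(113):.3f} Gbps")  # vs the quoted 0.904
print(f"{mib_per_s_to_gbps(38):.3f} Gbps")   # vs the quoted 0.304
```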

Ok, next up, mirror.nus.edu.sg. From past experience, I know that I can get very good speeds from this server. I used aria2c instead of the more common wget because it supports multiple simultaneous connections. Notice that 16 connections is slightly slower than 8 connections, probably because it takes a little longer to stitch the parts together.

$ aria2c http://mirror.nus.edu.sg/ubuntu-ISO/14.04.4/ubuntu-14.04.4-server-i386.iso

Download Results:
gid   |stat|avg speed  |path/URI
======+====+===========+=======================================================
xxxxxx|OK  |   6.5MiB/s|/home/Benjamin/ubuntu-14.04.4-server-i386.iso


$ aria2c --split=8 http://mirror.nus.edu.sg/ubuntu-ISO/14.04.4/ubuntu-14.04.4-server-i386.iso

Download Results:
gid   |stat|avg speed  |path/URI
======+====+===========+=======================================================
xxxxxx|OK  |    55MiB/s|/home/Benjamin/ubuntu-14.04.4-server-i386.iso


$ aria2c --split=16 http://mirror.nus.edu.sg/ubuntu-ISO/14.04.4/ubuntu-14.04.4-server-i386.iso

Download Results:
gid   |stat|avg speed  |path/URI
======+====+===========+=======================================================
xxxxxx|OK  |    52MiB/s|/home/Benjamin/ubuntu-14.04.4-server-i386.iso

It appears that one server might not be enough. How about downloading from two different servers, with 8 connections per server?

$ aria2c --split=16 http://mirror.nus.edu.sg/ubuntu-ISO/14.04.4/ubuntu-14.04.4-server-i386.iso http://download.nus.edu.sg/mirror/ubuntu-releases/14.04.4/ubuntu-14.04.4-server-i386.iso

Download Results:
gid   |stat|avg speed  |path/URI
======+====+===========+=======================================================
xxxxxx|OK  |    51MiB/s|/home/Benjamin/ubuntu-14.04.4-server-i386.iso

Ok, just for fun: four different servers, 8 connections per server. I expected the speed to take a huge dive because aria2c naively chops the file into equal chunks, one per server. Since the last two servers are located overseas and are much, much slower, we end up having to wait for them, which explains the poor performance.

$ aria2c --split=32 http://mirror.nus.edu.sg/ubuntu-ISO/14.04.4/ubuntu-14.04.4-server-i386.iso http://download.nus.edu.sg/mirror/ubuntu-releases/14.04.4/ubuntu-14.04.4-server-i386.iso http://mirror.pregi.net/ubuntu-cdimage/14.04.4/ubuntu-14.04.4-server-i386.iso http://mirror.umd.edu/ubuntu-iso/14.04.4/ubuntu-14.04.4-server-i386.iso

Download Results:
gid   |stat|avg speed  |path/URI
======+====+===========+=======================================================
xxxxxx|OK  |    32MiB/s|/home/Benjamin/ubuntu-14.04.4-server-i386.iso
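The equal-chunk effect described above can be sketched in a few lines of Python. The per-server rates below are made-up illustrative numbers, not measurements from these tests:

```python
def equal_chunk_time(size_mb: float, rates_mb_s: list[float]) -> float:
    """Finish time when the file is split equally across servers.

    The download only completes once the slowest server has delivered
    its chunk, so adding slow mirrors can hurt despite extra capacity.
    """
    chunk = size_mb / len(rates_mb_s)          # equal split per server
    return max(chunk / r for r in rates_mb_s)  # wait for the slowest

fast_only = equal_chunk_time(600, [55, 51])        # two local mirrors
with_slow = equal_chunk_time(600, [55, 51, 5, 5])  # plus two overseas ones
print(fast_only, with_slow)  # the four-server case is several times slower
```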

At this point, my best result is 55MB/s = 0.440Gbps, under half of the limit. It could be due to:

  1. My hardware being able to switch 1Gbps but not route 1Gbps, meaning the router needs an upgrade
  2. Singtel not actually providing the full 1Gbps of bandwidth
  3. The NUS servers throttling bandwidth per IP address

Yes, I could connect to multiple servers and download several different files at the same time, but most fast (>100Mbps) speedtest sites only provide 100MB test files, which complete in about 3 seconds, so it is difficult to obtain reliable readings.

Now that I think of it, I have never experienced speeds close to or above 55MB/s, whether through downloads, torrents, SCP or any other transfer method. Most of the time, transfer speeds are dramatically lower because the server is the bottleneck. Some time back, I wrote about 2Gbps fibre being just a gimmick; most of that was theory, and today I have provided actual findings. Maybe I should revise the title to say that 1Gbps is a gimmick as well.