Why a shared high-capacity link is better than parallel slow links
From the “African Internet Topology and Traffic Report”, written by Nemo Semret in 1998:
The most common mathematical model of a data communications link is the M/M/1 queue. In it, the average delay is given by T = 1/(C − r), where C is the capacity of the link (bits per second) and r is the average traffic arrival rate (bits per second). Suppose two similar neighbouring countries each have a long-distance link with capacity C and traffic r. Both experience the same average delay T. Now if they instead share a single link with capacity 2C and send all their traffic 2r on it, they each experience a delay of 1/(2C − 2r) = T/2, i.e. half the delay they had when each used its own link.

This is called multiplexing gain, and it comes from the fact that, statistically, the two countries won’t be sending data packets at exactly the same times, so there will be times when one is idle and the other can get the full high-speed link to itself. And since the cost of capacity 2C is generally less than double the cost of C (most of the cost of a link is independent of its capacity), each provider spends less money and gets double the performance!
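The arithmetic above can be checked directly with a few lines of Python; the particular capacity and traffic numbers below are illustrative, not from the report:

```python
def mm1_delay(capacity, rate):
    """Average delay T = 1/(C - r) for an M/M/1 queue; requires r < C."""
    assert rate < capacity, "queue is unstable when r >= C"
    return 1.0 / (capacity - rate)

# Each country on its own link: capacity C, traffic r (illustrative values).
C, r = 2_000_000, 1_000_000   # 2 Mbit/s link, 1 Mbit/s offered traffic
separate = mm1_delay(C, r)

# Both countries sharing one link: capacity 2C, combined traffic 2r.
shared = mm1_delay(2 * C, 2 * r)

print(separate / shared)  # → 2.0: the shared link halves the average delay
```

Whatever values of C and r you pick (with r < C), the ratio is always exactly 2, since 1/(C − r) divided by 1/(2C − 2r) = 2(C − r)/(C − r) = 2.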
Of course, reliability is a good reason to have multiple links. Still, it is clear that there is ample room for multiplexing gain in Africa.