Suppose Host A wants to send a large file to Host B. The path from Host A to Host B has three links, of rates R1 = 500 kbps, R2 = 2 Mbps, and R3 = 1 Mbps.

a. Assuming no other traffic in the network, what is the throughput for the file transfer?
b. Suppose the file is 4 million bytes. Dividing the file size by the throughput, roughly how long will it take to transfer the file to Host B?
c. Repeat (a) and (b), but now with R2 reduced to 100 kbps.

1 Answer

Answer:

a) 500 kbps  b) 64 sec  c) 320 sec

Step-by-step explanation:

a) The throughput of a path is the maximum rate at which the network can actually deliver data end to end. With no other traffic, it equals the lowest transmission rate among the links the traffic must traverse (the bottleneck link):

Throughput = min(R1, R2, R3) = min(500 kbps, 2 Mbps, 1 Mbps) = R1 = 500 kbps
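
As a quick sanity check, here is a minimal Python sketch of the bottleneck rule (the variable names are mine; the rates come from the problem statement):

    # End-to-end throughput with no competing traffic is the minimum link rate.
    link_rates_bps = [500e3, 2e6, 1e6]   # R1, R2, R3 in bits per second
    throughput_bps = min(link_rates_bps)
    print(throughput_bps)                # 500000.0 -> 500 kbps (bottleneck is R1)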

b) Since the file size is given in bytes and the throughput is in bits per second, we first convert the file size to bits:

4*10⁶ bytes * (8 bits/byte) = 32*10⁶ bits.

The time needed to transfer the file is the file size divided by the throughput:

t = 32*10⁶ bits / 500*10³ bits/sec = 64 sec
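
The same arithmetic in a short Python sketch (illustrative only; the values are from the problem):

    # Convert the file size from bytes to bits, then divide by the throughput.
    file_size_bits = 4e6 * 8                 # 4 million bytes -> 32e6 bits
    throughput_bps = 500e3                   # bottleneck rate from part (a)
    print(file_size_bits / throughput_bps)   # 64.0 seconds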

c) If R2 is reduced to 100 kbps, it becomes the lowest transmission rate on the path and therefore the new throughput.

The time to transfer the same file to Host B is then:

t = 32*10⁶ bits / 100*10³ bits/sec = 320 sec
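
Part (c) in the same sketch, with R2 lowered so the bottleneck moves:

    # With R2 = 100 kbps, the minimum link rate (the throughput) changes.
    link_rates_bps = [500e3, 100e3, 1e6]     # R2 reduced to 100 kbps
    throughput_bps = min(link_rates_bps)     # 100000.0 -> 100 kbps
    print(32e6 / throughput_bps)             # 320.0 seconds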
