Answer:
a) 500 kbps b) 64 sec c) 320 sec
Step-by-step explanation:
a) The throughput of a network is the actual rate at which it can deliver data end to end, which equals the lowest transmission rate among the links the traffic must traverse. Since R1 is the slowest link on the path, it is the bottleneck:
Throughput = min(R1, R2) = R1 = 500 kbps
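A minimal Python sketch of this bottleneck calculation (the value used for R2 is an assumption for illustration, since the problem only implies it exceeds R1):

```python
# Minimal sketch: end-to-end throughput is the bottleneck (minimum) link rate.
# The exact value of R2 isn't stated here; 2 Mbps is an assumed placeholder
# that is larger than R1, consistent with R1 being the bottleneck in part (a).
R1 = 500e3   # link 1 rate, bits per second
R2 = 2e6     # link 2 rate, bits per second (assumed value)

throughput = min(R1, R2)
print(f"Throughput: {throughput / 1e3:.0f} kbps")  # -> Throughput: 500 kbps
```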
b) The file size is given in bytes while the throughput is in bits per second, so we first convert the file size to bits:
4*10⁶ bytes * (8 bits/byte) = 32*10⁶ bits.
The time needed to transfer the file is the quotient of the file size and the throughput:
t = 32*10⁶ bits / 500*10³ bits/sec = 64 sec
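The same conversion and division, written as a short Python sketch using the values from the steps above:

```python
# Convert the file size to bits, then divide by the throughput to get the time.
file_size_bits = 4e6 * 8      # 4*10^6 bytes -> 32*10^6 bits
throughput = 500e3            # bottleneck rate from part (a), bits per second

transfer_time = file_size_bits / throughput
print(f"Transfer time: {transfer_time:.0f} sec")  # -> Transfer time: 64 sec
```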
c) If the transmission rate R2 is reduced to 100 kbps, R2 becomes the lowest transmission rate along the path, so it becomes the new throughput.
The time needed to transfer the same file to host B is therefore:
t = 32*10⁶ bits / 100*10³ bits/sec = 320 sec
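The corresponding sketch for the reduced rate, again using only the values given above:

```python
# Same file, but the reduced R2 (100 kbps) is now the bottleneck link.
file_size_bits = 32e6                 # from part (b)
new_throughput = min(500e3, 100e3)    # R1 = 500 kbps, reduced R2 = 100 kbps

transfer_time = file_size_bits / new_throughput
print(f"Transfer time: {transfer_time:.0f} sec")  # -> Transfer time: 320 sec
```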