• Suppose Host A wants to send a large file to Host B. The path from Host A to Host B has three links, of rates R1 = 500 kbps, R2 = 2 Mbps, and R3 = 1 Mbps.
  o Assuming no other traffic in the network, what is the throughput for the file transfer?
  o Suppose the file is 4 million bytes. Dividing the file size by the throughput, roughly how long will it take to transfer the file to Host B?

1 Answer


Answer:

It will take 64 seconds to transfer the file from Host A to Host B.

Step-by-step explanation:

The throughput for this transfer is determined by the bottleneck link, i.e. the link with the minimum rate among R1, R2 and R3: min(500 kbps, 2 Mbps, 1 Mbps).

Throughput = 500 kbps
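The bottleneck rule above can be sketched in a few lines of Python (link rates taken from the problem statement, expressed in bits per second):

```python
# Link rates in bits per second, from the problem statement.
R1 = 500_000      # 500 kbps
R2 = 2_000_000    # 2 Mbps
R3 = 1_000_000    # 1 Mbps

# End-to-end throughput is limited by the slowest link on the path.
throughput = min(R1, R2, R3)
print(throughput)  # 500000 bps = 500 kbps
```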

Time taken to transfer file = File Size/Throughput

The units of file size and throughput must be the same, so we convert 4 million bytes into bits. (Note: 1 byte = 8 bits.)

4,000,000 × 8 = 32,000,000 bits = 32 million bits

Time taken to transfer file = 32 million bits/500 kbps

= 32,000,000 bits / 500,000 bps

Time taken to transfer file = 64 seconds
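The full calculation can be checked with a short Python sketch (values taken from the problem; throughput is the bottleneck rate found above):

```python
file_size_bytes = 4_000_000        # 4 million bytes, from the problem
file_size_bits = file_size_bytes * 8  # convert to bits: 32,000,000
throughput_bps = 500_000           # bottleneck link rate, 500 kbps

# Transfer time = file size / throughput (ignoring propagation delay).
transfer_time = file_size_bits / throughput_bps
print(transfer_time)  # 64.0 seconds
```

This ignores propagation and queuing delays, which is consistent with the "roughly how long" wording of the question.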

