A packet traveling from your computer to a remote server crosses multiple networks end-to-end, over links with the following bandwidths (in order): 1000 Mbps, 1 Mbps, 40 Mbps, 400 Mbps, 100 Mbps, 1000 Mbps. Assuming your computer and the server can each transfer data at >= 1000 Mbps (so neither end device limits the transfer relative to the links), how long, in seconds, will it take to transfer a 125 MB file?

asked by Royhowie

1 Answer


Answer:

1000 seconds

Step-by-step explanation:

The end-to-end transfer rate is limited by the slowest link on the path (the bottleneck), not by the fastest one. Here the bottleneck is the 1 Mbps link, and the end devices (>= 1000 Mbps) do not constrain the transfer.

The transfer time is file size / bottleneck bandwidth. Since the file size is given in bytes and the bandwidths in bits per second, convert first:

125 MB × 8 bits/byte = 1000 Mb

Time = 1000 Mb / 1 Mbps = 1000 seconds.
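If you want to check the arithmetic, here is a minimal Python sketch (the variable names are my own, and it assumes the usual networking conventions of 1 MB = 10^6 bytes and 1 Mbps = 10^6 bits per second, so the only conversion needed is the factor of 8 from bytes to bits):

# Link bandwidths along the path, in Mbps, in the order the packet crosses them
link_bandwidths_mbps = [1000, 1, 40, 400, 100, 1000]
file_size_mb = 125  # megabytes

bottleneck_mbps = min(link_bandwidths_mbps)   # slowest link limits end-to-end throughput
file_size_megabits = file_size_mb * 8         # convert megabytes to megabits

transfer_time_s = file_size_megabits / bottleneck_mbps
print(f"Bottleneck: {bottleneck_mbps} Mbps, transfer time: {transfer_time_s:.0f} s")
# Output: Bottleneck: 1 Mbps, transfer time: 1000 s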

answered by Sabri Mevis