Your company has a central server at the HQ. Many branches want to remotely (i.e., across a network) access the company data stored at the server and perform their own data analysis. Considering the following simple specifications, answer the questions:

- To read data at a computer, the data needs to be read from a disk into main memory. Reading data stored on hard disks (from disk to memory) at the server has a transfer rate of 100 megabytes per second. Assume the same for your local computer.
- The read data then need to be transferred over a network. The network data transfer capacity (from the memory of the server to the memory of your local machine over the network) is 10 megabytes per second.
- Your analysis program at a branch has a size of 10 megabytes and can process 20 megabytes of data per second on the computer at the branch.
- Assume that reading, network data transfer, and program execution happen serially (no parallel operations assumed).
- Transfer time for any commands is so fast that it is ignored in the calculation.

a) How many seconds does it take to remotely read 2 gigabytes of data from the server and process it at your local machine? That is, the sum of the data read, transmission, and processing times.
b) If you can send your program to the server and execute it there with the same processing rate (20 MB/s), how many seconds will it take to process 2 gigabytes of data at the server?

2 Answers

1 vote

Final answer:

To remotely read and process 2 gigabytes of data (treated as 2048 megabytes), it would take 327.68 seconds using local-machine processing, while it would take about 123.88 seconds to process the data at the server.

Step-by-step explanation:

To remotely read 2 gigabytes of data from the server and process it at your local machine, we need to consider the time for reading, network transfer, and processing. First, we calculate the time for reading the data from the server's disk to main memory at a transfer rate of 100 megabytes per second. 2 gigabytes is equivalent to 2048 megabytes, so it takes 2048 MB / 100 MB per second = 20.48 seconds to read the data. Next, we calculate the time for transferring the data over the network at a capacity of 10 megabytes per second: 2048 MB / 10 MB per second = 204.8 seconds. Finally, we calculate the time for processing the data at your local machine. The program can process 20 megabytes of data per second, so it takes 2048 MB / 20 MB per second = 102.4 seconds. Therefore, the total time to remotely read and process 2 gigabytes of data is 20.48 seconds (reading) + 204.8 seconds (transfer) + 102.4 seconds (processing) = 327.68 seconds.
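
As a quick sanity check, here is a minimal Python sketch of the same arithmetic. It assumes the binary convention of 2 gigabytes = 2048 megabytes; the variable names are illustrative, not from the original question.

```python
# Part a: remote read + network transfer + local processing (2 GB taken as 2048 MB)
DATA_MB = 2048      # size of the data set
DISK_MB_S = 100     # disk-to-memory rate (server and local machine)
NET_MB_S = 10       # server memory -> local memory over the network
PROC_MB_S = 20      # processing rate of the analysis program

read_s = DATA_MB / DISK_MB_S      # 20.48 s
transfer_s = DATA_MB / NET_MB_S   # 204.8 s
process_s = DATA_MB / PROC_MB_S   # 102.4 s

print(read_s + transfer_s + process_s)  # 327.68
```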

If you can send your program to the server and execute it there with the same processing rate of 20 megabytes per second, the time to process 2 gigabytes of data at the server includes shipping the program, reading the data, and processing it. Sending the 10-megabyte program over the 10 MB per second network takes 1 second, reading the data from the server's disk to main memory takes 20.48 seconds as above, and processing it at 20 megabytes per second takes 102.4 seconds. Therefore, the total time to process 2 gigabytes of data at the server is 1 + 20.48 + 102.4 = 123.88 seconds, far less than doing the work remotely.
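
A matching sketch for part b, under the assumption that shipping the 10 MB program over the network is counted while loading the program itself into memory is ignored:

```python
# Part b: ship the program to the server and process the data there (2 GB taken as 2048 MB)
DATA_MB, PROGRAM_MB = 2048, 10
DISK_MB_S, NET_MB_S, PROC_MB_S = 100, 10, 20

ship_program_s = PROGRAM_MB / NET_MB_S  # 1.0 s to send the program over the network
read_s = DATA_MB / DISK_MB_S            # 20.48 s, disk -> memory at the server
process_s = DATA_MB / PROC_MB_S         # 102.4 s, processing at the server

print(ship_program_s + read_s + process_s)  # 123.88
```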

5 votes

Final answer:

To remotely read 2 gigabytes of data from the server and process it at your local machine, it will take a total of 320 seconds. If you send your program to the server and execute it there with the same processing rate, it will take 121 seconds to process the data at the server.

Step-by-step explanation:

To remotely read 2 gigabytes of data from the server and process it at your local machine, we need to calculate the time for each step involved, treating 2 gigabytes as 2000 megabytes. First, we calculate the time to read the data from the server's disk into its main memory: 2000 megabytes at a transfer rate of 100 megabytes per second takes 20 seconds. Then, we calculate the time to transfer the data over the network: 2000 megabytes at a network transfer rate of 10 megabytes per second takes 200 seconds. Finally, we calculate the time to process the data at the local machine: 2000 megabytes at a processing rate of 20 megabytes per second takes 100 seconds. Adding up these times, it takes a total of 320 seconds to remotely read and process the data.

If you send your program to the server and execute it there with the same processing rate of 20 megabytes per second, the time to process 2 gigabytes of data at the server consists of sending the 10-megabyte program over the network (10 MB / 10 MB per second = 1 second), reading the data from disk into memory at the server (2000 MB / 100 MB per second = 20 seconds), and processing it (2000 MB / 20 MB per second = 100 seconds). In total, it takes 121 seconds to process the data at the server, compared with 320 seconds remotely.
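
For comparison, here is the same calculation as a small Python sketch using this answer's decimal convention (2 gigabytes = 2000 megabytes); the helper functions and names are illustrative only.

```python
# Compare part a (process remotely) with part b (ship the program to the server),
# using this answer's convention of 2 GB = 2000 MB.
DATA_MB, PROGRAM_MB = 2000, 10
DISK_MB_S, NET_MB_S, PROC_MB_S = 100, 10, 20

def remote_processing_s(data_mb):
    # part a: read at the server, transfer over the network, process locally
    return data_mb / DISK_MB_S + data_mb / NET_MB_S + data_mb / PROC_MB_S

def server_processing_s(data_mb, program_mb):
    # part b: ship the program, read at the server, process at the server
    return program_mb / NET_MB_S + data_mb / DISK_MB_S + data_mb / PROC_MB_S

print(remote_processing_s(DATA_MB))              # 320.0 seconds
print(server_processing_s(DATA_MB, PROGRAM_MB))  # 121.0 seconds
```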

answered by Timothy Shields (8.2k points)
