Another pitfall cited in Section 1.10 is expecting to improve the overall performance of a computer by improving only one aspect of the computer. Consider a computer running a program that requires 250 s, with 70 s spent executing FP instructions, 85 s spent executing L/S instructions, and 40 s spent executing branch instructions. 1.13.2 [5] <1.10> By how much is the time for INT operations reduced if the total time is reduced by 20%?

1 Answer


Answer:


5.6 %

Step-by-step explanation:

Given:

Let FP denote the floating-point instructions; execution time of the FP instructions = 70 s

Total CPU execution time = 250 s

Execution time of the branch instructions = 40 s

Execution time of the L/S instructions = 85 s

First, reduce the FP instruction time by 20%:


0.8 × 70 s = 56 s

Next, find the new total time required to run the program:


250 s − (70 s − 56 s) = 236 s


Then, the decrease in total time = original total CPU time − new total time:


250 s − 236 s = 14 s

Finally, express the decrease as a percentage of the original total time:


(14 s / 250 s) × 100


= 5.6 %
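
If you want to check the arithmetic, here is a minimal Python sketch of the steps above (the variable names are my own, not from the exercise):

# Check the step-by-step arithmetic above.
total_time = 250.0   # total execution time, in seconds
fp_time = 70.0       # time spent executing FP instructions, in seconds

# Reduce the FP time by 20%.
new_fp_time = 0.8 * fp_time                            # 56 s

# New total time: only the FP portion of the program changes.
new_total_time = total_time - (fp_time - new_fp_time)  # 236 s

# Absolute and relative reduction of the total time.
reduction = total_time - new_total_time                # 14 s
reduction_pct = reduction / total_time * 100           # 5.6 %

print(f"new FP time:    {new_fp_time:.1f} s")
print(f"new total time: {new_total_time:.1f} s")
print(f"reduction:      {reduction:.1f} s ({reduction_pct:.1f} %)")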

answered by Ryan Erwin (6.7k points)