Suppose a program takes 1000 machine instructions to run from start to end, and can do that in 10 microseconds when no page faults occur. How long will this same program take if 1 in every 100 instructions has a page fault and each page fault takes 100 milliseconds to resolve?

Asked by Wberry

1 Answer


Answer:

(10^6 + 9.9) microseconds = 1000009.9 microseconds ≈ 1.0000099 seconds

Step-by-step explanation:

Given:

Total number of machine instructions = 1000

Number of page faults per 100 instructions = 1

Number of page faults in 1000 instructions = 1000/100 = 10

Time to service one page fault = 100 milliseconds

Time to service ten page faults = 10*100 milliseconds = 1000 milliseconds = 10^6 microseconds

Number of instructions that execute without a page fault = 1000 - 10 = 990 (the execution time of the 10 faulting instructions is taken to be included in their 100 ms service time)

Time required to run 1000 instructions = 10 microseconds, so each instruction takes 10/1000 = 0.01 microseconds

So, time required to run the 990 non-faulting instructions = 10*(990/1000) microseconds = 9.9 microseconds

So, the total time required to run the program = (10^6 + 9.9) microseconds = 1000009.9 microseconds ≈ 1.0000099 seconds
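
For completeness, here is a minimal Python sketch of the same arithmetic (variable names are my own; like the calculation above, it assumes the execution time of the 10 faulting instructions is folded into their 100 ms fault service time):

```python
# Sketch of the calculation above; names and structure are illustrative only.

total_instructions = 1000
base_time_us = 10.0          # time to run all 1000 instructions with no faults
fault_rate = 1 / 100         # 1 page fault per 100 instructions
fault_service_ms = 100.0     # time to resolve one page fault

num_faults = int(total_instructions * fault_rate)        # 10 page faults
fault_time_us = num_faults * fault_service_ms * 1000     # 10^6 microseconds

per_instruction_us = base_time_us / total_instructions   # 0.01 microseconds each
# Assumption: the 10 faulting instructions are counted inside their service time
non_fault_time_us = (total_instructions - num_faults) * per_instruction_us  # 9.9

total_us = fault_time_us + non_fault_time_us
print(f"Total run time: {total_us} microseconds (~{total_us / 1e6:.7f} s)")
# Total run time: 1000009.9 microseconds (~1.0000099 s)
```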

Answered by Nadean