19 October, 14:30

Suppose a program takes 1000 machine instructions to run from start to end, and can do that in 10 microseconds when no page faults occur. How long will this same program take if 1 in every 100 instructions has a page fault and each page fault takes 100 milliseconds to resolve?

Answers (1)
  1. 19 October, 17:49
(10^6 + 9.9) microseconds

    Explanation:

Given:

    Total number of machine instructions = 1000

    Number of page faults per 100 instructions = 1

    Number of page faults in 1000 instructions = 1000/100 = 10

    Time to service one page fault = 100 milliseconds

    Time to service ten page faults = 10 * 100 milliseconds = 1000 milliseconds = 10^6 microseconds

    Assuming the execution time of each faulting instruction is absorbed into its page-fault service time, the number of instructions executed without a page fault = 1000 - 10 = 990

    Time required to run 1000 instructions = 10 microseconds

    So, time required to run 990 instructions = 10 * (990/1000) microseconds = 9.9 microseconds

    So, the total time required to run the program = (10^6 + 9.9) microseconds = 1,000,009.9 microseconds, or roughly 1 second
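    As a quick sanity check, here is a small Python sketch of the same arithmetic. All variable names are illustrative (not from the original question), and it follows the same assumption that a faulting instruction's execution time is counted inside its fault-service time:

    # Page-fault timing arithmetic for the problem above (names are illustrative).
    instructions = 1000               # total machine instructions
    fault_free_time_us = 10.0         # time to run all 1000 instructions with no faults (microseconds)
    faults_per_100 = 1                # one page fault per 100 instructions
    fault_service_ms = 100            # time to resolve one page fault (milliseconds)

    page_faults = instructions * faults_per_100 // 100          # 10 faults
    fault_time_us = page_faults * fault_service_ms * 1000       # 10 * 100 ms = 1,000,000 us

    # Assumption: the faulting instructions' execution time is absorbed into the
    # service time, so only the remaining 990 instructions add ordinary run time.
    per_instruction_us = fault_free_time_us / instructions
    execution_time_us = (instructions - page_faults) * per_instruction_us   # 9.9 us

    total_us = fault_time_us + execution_time_us
    print(f"Total time: {total_us:,.1f} microseconds (~{total_us / 1e6:.5f} seconds)")
    # Output: Total time: 1,000,009.9 microseconds (~1.00001 seconds)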