12 January, 03:21

Calibrations on a recent version of an operating system showed that on the client side, there is a delay of at least 0.5 ms for a packet to get from an application to the network interface and a delay of 1.4 ms for the opposite path (network interface to application buffer). The corresponding minimum delays for the server are 0.20 ms and 0.30 ms, respectively.

What would be the accuracy of a run of Cristian's algorithm between a client and a server, both running this version of Linux, if the round-trip time measured at the client is 6.6 ms?

Answers (1)
  1. 12 January, 05:21
    ±2.1 ms (the uncertainty window is 4.2 ms)

    Explanation:

    Minimum time for the request to reach the server = 0.5 ms (client, application to interface) + 0.30 ms (server, interface to application) = 0.8 ms.

    Minimum time for the reply to return = 0.20 ms (server, application to interface) + 1.4 ms (client, interface to application) = 1.6 ms.

    Of the 6.6 ms round-trip time measured at the client, at least 0.8 + 1.6 = 2.4 ms is spent inside the two machines, leaving a window of 6.6 - 2.4 = 4.2 ms in which the server's timestamp could have been taken.

    Setting the client's clock to the midpoint of this window gives the best estimate, so the accuracy is ±4.2/2 = ±2.1 ms.
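    As a minimal sketch of this arithmetic in Python (the function name and parameters below are illustrative, not from any library):

        # Accuracy bound for one run of Cristian's algorithm, given the four
        # calibrated minimum delays and the round-trip time measured above.
        def cristian_accuracy(rtt, client_send, client_recv, server_send, server_recv):
            min_request = client_send + server_recv  # fastest possible client -> server leg
            min_reply = server_send + client_recv    # fastest possible server -> client leg
            window = rtt - min_request - min_reply   # span in which the timestamp may sit
            return window, window / 2                # midpoint estimate errs by at most half

        window, accuracy = cristian_accuracy(6.6, 0.5, 1.4, 0.20, 0.30)
        print(f"window = {window:.1f} ms, accuracy = +/-{accuracy:.1f} ms")
        # window = 4.2 ms, accuracy = +/-2.1 ms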