Calibrations on a recent version of Linux showed that on the client side, there is a minimum delay of 0.5 ms for a packet to get from an application to the network interface, and a minimum delay of 1.4 ms for the opposite path (network interface to application buffer). The corresponding minimum delays on the server side are 0.20 ms and 0.30 ms, respectively.
What would be the accuracy of a run of Cristian's algorithm between a client and a server, both running this version of Linux, if the round-trip time measured at the client is 6.6 ms?
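As a minimal sketch of one way to work this out, assuming the usual Cristian bound of plus or minus (T_round - min_request - min_reply)/2 and ignoring network propagation beyond the stated OS delays, the calculation could be set up as follows (the helper name cristian_accuracy_ms is illustrative, not part of the exercise):

    # Sketch: Cristian accuracy bound from the minimum one-way delays.
    # Assumes accuracy = +/- (T_round - min_request - min_reply) / 2.

    def cristian_accuracy_ms(round_trip_ms, min_request_ms, min_reply_ms):
        """Half-width of the interval in which the server's timestamp
        could have been taken, relative to the client's midpoint estimate."""
        return (round_trip_ms - min_request_ms - min_reply_ms) / 2

    # Minimum request delay: client app -> NIC (0.5 ms) + server NIC -> app (0.30 ms)
    min_request = 0.5 + 0.30   # 0.8 ms
    # Minimum reply delay: server app -> NIC (0.20 ms) + client NIC -> app (1.4 ms)
    min_reply = 0.20 + 1.4     # 1.6 ms

    print(cristian_accuracy_ms(6.6, min_request, min_reply))  # 2.1 -> accuracy of +/- 2.1 ms

Under these assumptions, the earliest the server could have read its clock is 0.8 ms after the client sent the request, and the latest is 1.6 ms before the reply arrived, giving an uncertainty window of 6.6 - 0.8 - 1.6 = 4.2 ms and hence an accuracy of about plus or minus 2.1 ms.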