Consider the behavior of two machines in a distributed system. Both have clocks that are supposed to tick 1000 times per millisecond. One of them actually does, but the other ticks only 990 times per millisecond. If UTC updates come in once a minute, what is the maximum clock skew that will occur?

User Kulls

1 Answer


Final answer:

The maximum clock skew between two machines in a distributed system, one ticking at 1000 times/millisecond and the other at 990 times/millisecond with UTC updates once a minute, would be 600 milliseconds (0.6 seconds) just before resynchronization.

Step-by-step explanation:

When considering clocks in a distributed system where one clock ticks at 1000 times per millisecond and another ticks at 990 times per millisecond, the maximum clock skew that will occur is determined by the difference in tick rates and the update interval.

In this case, the update interval is once a minute, or 60,000 milliseconds. The second machine falls 10 ticks behind per millisecond, which amounts to 10 × 60,000 = 600,000 ticks over one minute.

Since 1000 ticks correspond to one millisecond of clock time, the 600,000-tick deficit converts back to time as:

Maximum clock skew = 600,000 ticks ÷ 1000 ticks per millisecond = 600 milliseconds, or 0.6 seconds.

Therefore, the maximum clock skew just before the clocks receive the UTC update and presumably resynchronize would be 600 milliseconds, i.e. 0.6 seconds.
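The arithmetic above can be checked with a short sketch (variable names are illustrative, not from the problem statement):

```python
# Worked check of the clock-skew calculation, using the values from the problem.
TICKS_PER_MS_NOMINAL = 1000   # correct clock: ticks per millisecond
TICKS_PER_MS_SLOW = 990       # faulty clock: ticks per millisecond
UPDATE_INTERVAL_MS = 60_000   # UTC updates arrive once a minute

# Ticks the slow clock falls behind over one update interval.
tick_deficit = (TICKS_PER_MS_NOMINAL - TICKS_PER_MS_SLOW) * UPDATE_INTERVAL_MS

# Convert the tick deficit back into milliseconds of clock time.
skew_ms = tick_deficit / TICKS_PER_MS_NOMINAL

print(tick_deficit)  # 600000 ticks behind per minute
print(skew_ms)       # 600.0 ms = 0.6 seconds of maximum skew
```

Equivalently, the slow clock drifts at a rate of 10/1000 = 1% of real time, so over a 60-second interval it accumulates 0.6 seconds of skew.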

User Dirk Deyne