So many times: the clocks in your Mac

No matter where you are on the earth, although your clock may show a different time, it advances in perfect step with all the other clocks on the planet. Einstein and relativity may have introduced devious, mind-boggling prospects such as non-linear time and even time travel, but for the here and now, the march of time is one of our few universals.

Except, that is, to computers, including Macs and iOS devices. Look carefully enough in your Mac and you can find three separate clocks, which behave quite differently. This is how some timestamps – such as those in Sierra’s new logs – can appear to change each time you look at them. This article explains how these clocks work with respect to one another, and how to interpret apparent glitches in time.

External reference time

External reference time, or Coordinated Universal Time (UTC), is a global, uniform time standard which is promulgated and propagated across several different types of service. Traditionally, computers have obtained UTC references periodically from time servers using the Network Time Protocol (NTP). I have explained how NTP works, and how it is used to correct the clock of your Mac, in this article.

More recently, other sources of very accurate UTC references have become available, including GPS, whose satellites broadcast time signals to enable precise locations to be calculated by any device that can receive them, and mobile phone networks. Those may progressively replace the older NTP system, particularly in mobile devices.

Local reference time

Macs maintain their own local estimate of this reference time, which forms the basis of the standard menu bar clock. It should keep close to external reference time, but because the two are compared relatively infrequently, when local time is re-aligned to UTC the local clock can change suddenly, or behave non-linearly for a period.

Local reference time is accessed in terms of the date, hours, minutes, and seconds, and should advance at a rate of 1 local second for every second of UTC, except during realignment. This is the standard time system used by the vast majority of apps, including macOS itself when setting timestamps on files and many events.
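As a sketch in Python rather than macOS’s own APIs, this is the clock which almost all software reads, and the same clock which stamps files:

```python
import os
import tempfile
import time
from datetime import datetime, timezone

# Local reference time, accessed as a calendar date plus hours,
# minutes and seconds.
now_local = datetime.now()             # expressed in the local zone
now_utc = datetime.now(timezone.utc)   # the same instant, in UTC

# File timestamps are set from the same clock, stored as seconds
# since the Unix epoch.
with tempfile.NamedTemporaryFile() as f:
    f.write(b"hello")
    f.flush()
    mtime = os.stat(f.name).st_mtime

print(now_local, now_utc, mtime)
```

If the local reference clock were wrong when the file was written, that `st_mtime` would be wrong forever, which is exactly the hazard described below.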

You can also change local reference time, in the Date & Time pane. This has been used, for example, to fool expired software into thinking that it is still valid, but is very dangerous. Timestamps written when the local reference time is incorrect will remain forever incorrect. Worse, network communications will use incorrect times, and that can make some services fail. A lot of what we do now assumes that local reference time is very close to UTC.

Local reference time is most completely expressed with its accompanying time zone, the adjustment applied to convert your Mac’s approximation of UTC to local time. A further adjustment is often necessary to allow for seasonal settings such as Summer Time or Daylight Saving Time before arriving at the figures displayed in the menu bar.
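The two adjustments can be seen separately with Python’s `zoneinfo` module standing in for the macOS time zone machinery; New York is simply an illustrative zone which observes a seasonal change:

```python
from datetime import datetime, timezone, timedelta
from zoneinfo import ZoneInfo  # Python 3.9+; uses the system tz database

# The same zone applies a different total adjustment in July and in
# January: zone offset plus a seasonal adjustment, versus zone
# offset alone.
tz = ZoneInfo("America/New_York")
july = datetime(2017, 7, 1, 12, 0, tzinfo=timezone.utc).astimezone(tz)
january = datetime(2017, 1, 1, 12, 0, tzinfo=timezone.utc).astimezone(tz)

offset_july = july.utcoffset() / timedelta(hours=1)      # -4.0 (EDT)
offset_january = january.utcoffset() / timedelta(hours=1)  # -5.0 (EST)
print(offset_july, offset_january)
```

Software which applies only the zone offset, and forgets the seasonal adjustment, is one source of the bugs described next.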

Adjustments for time zone and season often catch users and software developers out, particularly when Macs (and worse, iOS devices) move from one timezone to another, and when seasonal changes are made to local time. These can be bizarre: some users reported in 2016 that after one seasonal change, Calendar displayed correct local times but printed out incorrect ones. There is little that a user can do to work around such bugs.

Hardware ticks

Processors and their buses rely on precise, high-resolution clocks which do not keep time as such, but count ticks. The frequency of these ticks is hardware-dependent: clearly, on an original 1998 iMac with a processor running at 233 MHz they are of much lower frequency than on a current MacBook Pro running at its maximum processor ‘speed’ of 3.8 GHz. To allow for this, macOS converts these raw ticks, or Mach absolute time, into nanoseconds.

Hardware ticks are most useful for measuring precise intervals in time, and form the basis of high-speed timestamping mechanisms, such as those for Sierra’s new log entries, because minimal processing is required. However, they do not convert directly into local real time.
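Python exposes an equivalent tick counter, which illustrates the point: it measures intervals precisely, but carries no calendar meaning of its own.

```python
import time

# time.monotonic_ns() plays a similar role to Mach absolute time on
# macOS: a tick count, reported in nanoseconds, that only moves
# forward and is unaffected by adjustments to the wall clock.
start = time.monotonic_ns()
time.sleep(0.05)
elapsed_ns = time.monotonic_ns() - start

# Ideal for timing intervals; but the count's zero point is
# arbitrary (often boot time), so it is not a time of day.
print(f"elapsed: {elapsed_ns / 1e9:.3f} s")
```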

To estimate a local reference time for an event which took place in the past and was logged using ticks, your Mac will normally calculate the difference in tick count between the event and now, then, using the current local reference time as its base, subtract that interval to work out the time of the event.
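That calculation can be sketched in Python, with `monotonic_ns` standing in for Mach absolute time; `estimate_event_time` is a hypothetical helper, not a macOS API:

```python
import time
from datetime import datetime, timedelta

def estimate_event_time(event_ticks_ns: int) -> datetime:
    """Estimate the calendar time of a past event that was logged
    only as a tick count, in nanoseconds."""
    age_ns = time.monotonic_ns() - event_ticks_ns  # ticks since the event
    now = datetime.now()                           # local reference time now
    return now - timedelta(microseconds=age_ns / 1000)

# An event logged by its tick count alone...
event_ticks = time.monotonic_ns()
time.sleep(0.1)

# ...later converted to an approximate calendar time. Repeating the
# call later may give a slightly different answer if the local clock
# has drifted against the tick count in the meantime.
when = estimate_event_time(event_ticks)
print(when)
```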

If local reference time drifts slightly with respect to hardware ticks – which it does – then the calculated time of an historic event will change slightly with that drift. Perform the same calculation now, and in an hour’s time, and there will inevitably be a difference in the two estimates of the time of that event. However, differences should remain small, and are confined to events timed on the basis of hardware ticks.

The other snag with hardware ticks is that they stop when a system is suspended in sleep or shut down. macOS makes allowance for this so that, for example, the tick counts stored in Sierra’s log do not lose chunks of time.

Software timers

macOS (and iOS, et al.) provides apps with a software timer feature, which can be used to run periodic events such as updating a window. These are based on hardware ticks, so can be run quite frequently, and can resolve to around 50–100 milliseconds. However, they guarantee only a minimum time interval, and events which are driven by software timers will be delayed by sleep or ‘App Nap’ when in the background, by other processes, or by a range of other factors.

Any apps which use software timers to try to measure precise times are likely to misbehave, and should be avoided: that is not their purpose.
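Python’s `threading.Timer` makes a convenient stand-in to demonstrate the minimum-interval semantics, though it is of course not the macOS timer itself:

```python
import threading
import time

# Like macOS software timers, threading.Timer guarantees only a
# MINIMUM delay: the callback may fire late if the system is busy,
# but never early.
delays = []
start = time.monotonic()
timer = threading.Timer(0.1, lambda: delays.append(time.monotonic() - start))
timer.start()
timer.join()  # Timer is a Thread subclass, so we can wait for it

print(f"asked for 0.100 s, fired after {delays[0]:.3f} s")
```

The recorded delay will always be at least the interval asked for, and on a loaded system can be considerably more, which is why such timers are unfit for precise measurement.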

Units of time

Time systems which are based on hardware ticks normally work in 64-bit integer values of nanoseconds, while time intervals are specified in double-precision floating point values of seconds. The former allows a precision of 1 nanosecond over a range of around 292 years (ignoring negative values). Time intervals have a precision of better than 1 millisecond over a range of about 10,000 years.

Many systems using tick counts standardise against the start of the Unix epoch, which sets 0 at midnight at the start of 1 January 1970. Using just positive integers, those tick counts should not roll over until the year 2262. Until someone devises a means of travelling through time, we will have no concerns about current clocks and timers having to face that rollover.
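A quick arithmetic check of those figures, sketched in Python:

```python
import math

SECONDS_PER_YEAR = 365.25 * 24 * 3600  # Julian year, close enough here

# A 64-bit count of nanoseconds, ignoring negative values, spans
# about 292 years...
range_years = 2**63 / 1e9 / SECONDS_PER_YEAR
print(f"range: {range_years:.1f} years")

# ...so a count starting at the 1970 Unix epoch rolls over in 2262.
rollover_year = 1970 + int(range_years)
print(f"rollover in {rollover_year}")

# Double-precision seconds: the spacing between adjacent
# representable values 10,000 years out is still under a millisecond.
spacing = math.ulp(10_000 * SECONDS_PER_YEAR)
print(f"spacing at 10,000 years: {spacing:.2e} s")
```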