Over the last 14 years, we became accustomed to all Macs being based on Intel processors and chipsets, which encourages bad habits and unrealistic expectations. One of those has been that the internal clock which keeps Mach Absolute Time ticks once every nanosecond. It thus provides a high precision way of measuring time which is also simple to read and manipulate. In the M1 Macs, time has changed, and now ticks every 41.67 nanoseconds, which makes a big difference.
Officially, any software which accesses Mach Absolute Time (MAT), regardless of the hardware it runs on, should check the Timebase, from which it gets a numerator and denominator which is then used to convert from MAT to (nano)seconds. For Intel Macs, the numerator and denominator are fixed at 1, so there’s actually no conversion to be done to get the time in nanoseconds. On an M1 Mac, the numerator is 125, and denominator 3, meaning the Mach Absolute Time increments one tick every 41.67 nanoseconds.
If you measure a time difference of 1000 ticks, on an Intel Mac that represents one microsecond, but on an M1 Mac it’s really 41.67 microseconds, a difference which should be glaringly obvious.
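To make that arithmetic concrete, here's a minimal Swift sketch of the conversion, using the two timebases given above; the function name is mine, chosen for illustration:

```swift
// Convert a Mach tick count to nanoseconds using a timebase (numer/denom).
func ticksToNanoseconds(_ ticks: UInt64, numer: UInt64, denom: UInt64) -> UInt64 {
    return ticks * numer / denom
}

let ticks: UInt64 = 1000
let intelNS = ticksToNanoseconds(ticks, numer: 1, denom: 1)
// Intel timebase 1/1: 1000 ticks = 1,000 ns = 1 µs
let m1NS = ticksToNanoseconds(ticks, numer: 125, denom: 3)
// M1 timebase 125/3: 1000 ticks = 41,666 ns ≈ 41.67 µs
```

Note that the multiplication is performed before the division, so that integer arithmetic doesn't throw away the fractional part of 125/3.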
To minimise the bugs which this could reveal, when an app is running in Intel mode, using Rosetta 2, its timebase reverts to that for Intel systems, at one tick per nanosecond. This should ensure that apps which don’t use the timebase as they should can still make sense of MAT values. This doesn’t solve all the problems which can arise, though, and you’ll still come across software which reports strange times, behaves oddly or is plain wrong.
You can test this on your M1 Mac using my free utility Mints, which has a convenient button to display Mach timebase information. Run Mints in native mode, the default as it’s a Universal App, and it reports the new ARM values; force it to launch as an app translated by Rosetta, and it will give the same timebase as an Intel Mac.
If you have any apps or code which use Mach Absolute Time, you’ll need to review them carefully to see if this affects them.
One place where you’ll encounter this problem with the changed timebase is in the Unified log.
Let’s say that you use the log show command to dump log extracts to files, and need access to their machTimestamp field for high-precision timing. Do that on an Apple Silicon Mac, and those values are no longer nanoseconds at all. To convert them to nanoseconds, you have to multiply by the scaling factor, which is 1 on an Intel Mac but 125/3 on an M1 model. As far as I’m aware, that scaling factor isn’t saved in the log file, nor are details of the architecture on which those logs were written.
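Because neither the scaling factor nor the source architecture is recorded in the log, any conversion has to be told which kind of Mac wrote the entries. A sketch in Swift, with a function name of my own invention:

```swift
// Scale a machTimestamp value from a log extract to nanoseconds.
// The log doesn't record the architecture it was written on, so the
// caller has to know whether it came from an Apple Silicon Mac.
func machTimestampToNanoseconds(_ timestamp: UInt64, appleSilicon: Bool) -> UInt64 {
    // Apple Silicon timebase is 125/3; Intel is 1/1, so no scaling needed.
    return appleSilicon ? timestamp * 125 / 3 : timestamp
}
```

Get that Boolean wrong and every interval computed from the log will be out by a factor of nearly 42.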
If you’re accustomed to browsing Intel Mac logs with the aid of the high resolution of the machTimestamp field, then M1 Macs might be a bit of a disappointment in this respect. However, even when the log gets hectic, you’re extremely unlikely to encounter any collision between entries with the same machTimestamp, and compared with the normal date and time field, this is still high resolution.
There has been one other change in the log recently: in Big Sur, there’s a new field bootUUID which appears in every log entry. This seems to duplicate or replace what was previously written to the com.apple.uuiddb.boot-uuid extended attribute of some log files. It contains the same UUID for every entry, which is that assigned at the last system startup, and matches that reported in System Information, under Hardware > Controller > Boot UUID. I will be looking more carefully at this later, but as it only changes at startup, I don’t intend offering it as an additional field for display in Ulbow and Consolation, my free log browsers.
Sadly, for those hoping that Apple would finally make its bundled Console app more useful, and able to browse recent log entries without creating a logarchive first, Console in Big Sur remains incapable. There don’t appear to be any significant changes in the log command either, and its man page remains dated 10 May 2016.
What we should have been doing when measuring time intervals using Mach Absolute Time, was applying a conversion factor to the difference in clock ticks, to convert from ticks to nanoseconds. In Swift, something like:
var info = mach_timebase_info()
mach_timebase_info(&info)
((secondTicks - firstTicks) * UInt64(info.numer)) / UInt64(info.denom)
returns a time interval in nanoseconds.
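Put together, a complete measurement might look something like this sketch, using the Darwin APIs mach_timebase_info and mach_absolute_time; the work being timed is a placeholder:

```swift
import Darwin

// Populate the timebase: Mach fills in the numerator and denominator
// for the hardware this process is running on (or the Rosetta values).
var info = mach_timebase_info()
mach_timebase_info(&info)

let firstTicks = mach_absolute_time()
// ... the work to be timed goes here ...
let secondTicks = mach_absolute_time()

// Multiply before dividing, to preserve precision in integer arithmetic.
let elapsedNS = ((secondTicks - firstTicks) * UInt64(info.numer)) / UInt64(info.denom)
print("elapsed: \(elapsedNS) ns")
```

Run natively on an M1 Mac this uses the 125/3 timebase; run under Rosetta 2 it gets 1/1, which is why code written this way keeps working on both.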
Some ideas which may work for languages other than Swift and Objective-C are given in this handy cross-platform compilation.