Mints now tells you the (Mach absolute) time

Following my exploration of Mach absolute time last week, I’ve been unable to find any GUI or command tool which can tell you the scaling factor required to convert raw values of Mach absolute time to nanoseconds. This isn’t perhaps surprising, as over the last decade or so it has been boringly stuck at 1. With that situation changing on Apple Silicon systems, it suddenly becomes important.

Let’s say that you use the log show command to dump log extracts to files, and need access to their machTimestamp field for high-precision timing. Do that on an Apple Silicon Mac, and those values are no longer nanoseconds at all. To convert them to nanoseconds, you have to multiply by a scaling factor which only seems readily accessible from compiled code.*
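
For anyone wondering where that factor comes from in compiled code: it's the numerator and denominator returned by the Mach call mach_timebase_info(), and nanoseconds = ticks × numerator ÷ denominator. Here's a minimal Swift sketch of the conversion; the example machTimestamp is simply the raw value shown further down this post, and the variable names are mine rather than anything from Mints:

import Darwin

// Obtain the Mach timebase: nanoseconds = ticks * numer / denom.
var timebase = mach_timebase_info_data_t()
guard mach_timebase_info(&timebase) == KERN_SUCCESS else {
    fatalError("unable to obtain the Mach timebase")
}

// An example machTimestamp taken from a log extract (the raw value shown below).
let machTimestamp: UInt64 = 2812573112130686

// Multiply before dividing to preserve precision; on Intel Macs
// numer == denom == 1, so the value is returned unchanged.
let nanoseconds = machTimestamp * UInt64(timebase.numer) / UInt64(timebase.denom)
print("\(machTimestamp) ticks = \(nanoseconds) ns")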

This new version of Mints, my collection of tools, has an extra button which opens a window displaying the current timebase values, together with both a raw Mach Absolute Time value and that value scaled to nanoseconds. Run this on an Intel Mac, and it’s predictably boring:
Running as Intel code:
Timebase numerator = 1
denominator = 1
factor = 1.0
Mach Absolute Time (raw) = 2812573112130686
Mach Absolute Time (corr) = 2812573112130686
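
Mints’ own source isn’t shown here, but figures like those can be produced with just two Mach calls, mach_timebase_info() for the timebase and mach_absolute_time() for the raw tick count. A sketch along those lines, formatted to match the output above:

import Darwin

// Query the timebase and the current raw tick count.
var timebase = mach_timebase_info_data_t()
_ = mach_timebase_info(&timebase)    // KERN_SUCCESS is assumed here for brevity

// The scaling factor is simply numerator over denominator.
let factor = Double(timebase.numer) / Double(timebase.denom)
let raw = mach_absolute_time()
let corrected = raw * UInt64(timebase.numer) / UInt64(timebase.denom)

print("Timebase numerator = \(timebase.numer)")
print("denominator = \(timebase.denom)")
print("factor = \(factor)")
print("Mach Absolute Time (raw) = \(raw)")
print("Mach Absolute Time (corr) = \(corrected)")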

If you have access to a DTK, though, you might be very surprised indeed. There’s another trick you can try with a DTK which you might find interesting: quit Mints, select the app in the Finder and open its Get Info dialog, enable the Open using Rosetta option, then open Mints again and check Mach Absolute Time to see how different the factors and time become.

Being mindful that other users need access to Mach timebase information too, for example from the command line, I’ll be making a pair of tiny command tools to generate the same information in a form that’s accessible to scripts and other apps.
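
As an indication of how tiny such a tool can be, here’s a sketch in which the whole of main.swift fits in a dozen lines. Its output format, with numerator, denominator and factor on a single space-separated line, is chosen purely to be easy for scripts to parse, and isn’t necessarily what the finished tools will produce:

// main.swift for a tiny command tool reporting the Mach timebase.
import Darwin

var timebase = mach_timebase_info_data_t()
guard mach_timebase_info(&timebase) == KERN_SUCCESS else {
    fputs("unable to obtain the Mach timebase\n", stderr)
    exit(1)
}
// Print numerator, denominator and factor, separated by spaces.
print("\(timebase.numer) \(timebase.denom) \(Double(timebase.numer) / Double(timebase.denom))")

A script can then capture that single line and pick out whichever field it needs.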

Mints version 1.0b6 is now available from here: mints10b6, from Updates above, from its Product Page, and via its auto-update mechanism. It is, of course, a Universal App.

* Here’s a disturbing thought for those involved in forensics and other situations in which logs from another Mac have to be analysed. The machTimestamp field in the unified log gives raw Mach absolute time, which isn’t scaled to nanoseconds. There’s nothing in the log which tells you the required scaling factor, and even working out whether it has come from an Intel or an Apple Silicon model may be difficult. How can you tell whether you need to apply a scaling factor to machTimestamp, and what factor should you apply, particularly if different Apple Silicon models use different timebases?