The purpose of notarization of apps and other executable code is to give reasonable assurance to users that what they’ve downloaded from a third-party site isn’t malicious. In Apple’s words, notarization “proves that Apple has been provided a copy of the code to check for malware and no known malware was found.”
When you first open that app, plug-in or other code after you’ve downloaded it, it normally has a quarantine flag set, as a result of which macOS conducts a full first run check on it. If macOS discovers that the app is notarized, either by reading the ticket stapled to it or by looking it up with Apple, you’re informed that Apple has checked it for malicious software and that none was found. You’re then able to continue to open it.
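You can inspect that quarantine flag for yourself, as it’s stored as an extended attribute on the downloaded item. A minimal sketch from Terminal, using a hypothetical download path:

```shell
# Show the quarantine extended attribute on a downloaded app
# (the path is an example; substitute your own download)
xattr -p com.apple.quarantine ~/Downloads/Example.app

# The value takes the form flags;timestamp;agent;UUID, where the
# agent is the app which downloaded the item, such as Safari

# Removing the attribute skips the first run check entirely,
# so only do this for software you already trust:
# xattr -d com.apple.quarantine ~/Downloads/Example.app
```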
There’s much more to notarization than simply submitting an app for Apple to check for malicious code. For all new notarizations, the following are also required:
- all executable code is correctly signed with a Developer ID Application or similar certificate, and includes a secure timestamp;
- the ‘hardened runtime’ is enabled, which limits the app’s behaviour and should prevent the app from being abused by malicious code;
- the app doesn’t have the entitlement set to enable debugging (in Xcode);
- the app has been built with the macOS 10.9 or later SDK;
- its entitlements are properly formatted.
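For developers working outside Xcode, most of those requirements come down to options passed to codesign when the app is built. This is a sketch rather than Apple’s recommended workflow; the identity name and app path are placeholders:

```shell
# Sign the app with a secure timestamp and the hardened runtime enabled.
# "Developer ID Application: Example Corp" is a placeholder identity.
# Apple recommends signing nested code individually; --deep is used
# here only for brevity.
codesign --force --deep \
  --sign "Developer ID Application: Example Corp" \
  --options runtime \
  --timestamp \
  MyApp.app

# Display the entitlements, to confirm they're properly formatted and
# that get-task-allow (the debugging entitlement) isn't among them
codesign -d --entitlements :- MyApp.app
```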
Taken together, these ensure that all the code which an app can run has been properly signed at the time that it was built, and hasn’t been tampered with since, which should be reassuring to the user.
Apple’s malware check
Inevitably, Apple hasn’t disclosed details of the checks that it performs before it issues a notarization ticket. These will include checks against the signatures of known malware, but don’t appear to go much further than that. On a good day, getting an app notarized takes around two minutes from the completion of its delivery to Apple to its ticket becoming available to download and staple to the app. That doesn’t allow time for humans to examine anything which the automatic checks consider dubious.
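Submission itself can be scripted. With current versions of Xcode’s tools this uses notarytool (older versions used altool instead); the profile name and paths below are placeholders:

```shell
# Compress the app for upload; notarization accepts zip, dmg and pkg
ditto -c -k --keepParent MyApp.app MyApp.zip

# Submit it and wait for the verdict. "AC_PROFILE" is a placeholder
# keychain profile containing App Store Connect credentials.
xcrun notarytool submit MyApp.zip --keychain-profile "AC_PROFILE" --wait

# On success, staple the resulting ticket to the app itself
xcrun stapler staple MyApp.app
```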
As you’ve no doubt heard, Apple’s checks have already been fooled by a couple of examples of malware (at least), so we know that this isn’t perfect. It does, though, give Apple the distinct advantage that it has a sample of every app which has been submitted for notarization, providing a valuable start in the event that any does turn out to be malicious.
Not all apps which have been notarized meet the strict requirements given above. One class of exemptions is that of legacy software. Apple has encouraged all developers to submit existing apps and other executables even if they don’t meet the criteria for recent apps. It’s unknown how many apps this applies to, but it’s largely confined to products built before 1 June 2019, when notarization first became a requirement for macOS Mojave.
There’s another even murkier category of legacy software which hasn’t been notarized by its developer, but by a third party. Apple stresses that code signing and notarization are separate processes, and can be performed by different actors. While only the developer should sign an app, anyone can submit apps for notarization. Several developers have been surprised to discover that old versions of their software have now been submitted to Apple, checked for malware, and subsequently notarized. This is possible because the ticket resulting from successful notarization doesn’t have to be stapled to the software, and may only become apparent during first run checks on the app.
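If you want to check whether a given app carries a stapled ticket, or whether Gatekeeper considers it notarized at all, two commands help. A sketch, with a placeholder path:

```shell
# Check whether a notarization ticket is stapled to the app bundle
xcrun stapler validate MyApp.app

# Assess the app as Gatekeeper would; for a notarized app the verbose
# output ends with a line such as: source=Notarized Developer ID
spctl --assess --verbose=2 MyApp.app
```

An app which passes the spctl assessment but fails stapler validation is likely to be relying on an online lookup of its ticket rather than a stapled copy.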
It’s not known who has been quietly notarizing large numbers of old apps, but some have speculated that this may have been performed for Apple.
So although recent and current notarized software should invariably comply with the requirements listed above, that isn’t the case for software signed earlier, which may have been granted legacy notarization without being signed in depth or using the hardened runtime.
How is the runtime hardened?
The value and user benefits of most of the requirements for notarization are generally clear. Requiring the app to be signed in depth means that the integrity of all its executable components can be checked using their cdhash, which is included with the signature. The secure timestamp ensures that the code can’t have been modified and the changes backdated so that they’re still covered by that signature. But what about the ‘hardened runtime’: what does that do?
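You can see the cdhash recorded in a signature, and verify that the code on disk still matches it, using codesign. A sketch with a placeholder path; note that codesign writes its details to stderr:

```shell
# Display the code directory hash (CDHash) recorded in the signature
codesign -dvvv MyApp.app 2>&1 | grep CDHash

# Verify that every executable component still matches its signature
codesign --verify --deep --strict --verbose=2 MyApp.app
```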
When a ‘hardened’ app is run, there are many things it can’t do. For example, it can’t modify its own code in memory, a technique which some old apps might use, and which may also be used by malware. The hardened runtime also enforces many of the privacy controls introduced successively from macOS Mojave: if a hardened app tries to access protected data without being entitled to do so, that will be refused, and the app normally crashes immediately.
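Whether an app has the hardened runtime enabled is visible in its signature, where it appears as the runtime flag in codesign’s output. A sketch with a placeholder path:

```shell
# codesign prints signature details on stderr; a hardened app shows
# a line such as: CodeDirectory ... flags=0x10000(runtime)
codesign --display --verbose MyApp.app 2>&1 | grep flags
```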
For the user, hardened apps should be really good news, as they should run in a more secure environment and respect the user’s privacy. That is, unless that app opts out, which is the subject of another article.