Last Week on My Mac: Lies, notarization and privacy

I suppose I shouldn’t have been surprised that emails claiming to be from my internet provider turned out not to be phishing but yet another lie. When their subject promised “email security is being improved in a week’s time” what it actually meant was that the provider was going to block all access using POP/SMTP and IMAP, and force me to use Office 365 on the web. And have the gall to tell me how much better that will be.

Why do so many – politicians, social media moguls, huge businesses – lie through their teeth in this way?

Its effect is corrosive, and leads us to doubt everything and everyone, even Apple. For the week after WWDC has once again been a time when anxious Mac users have been asking whether the next version of macOS will still allow apps that don’t come from the App Store and haven’t been notarized. Yet others have been telling me how Apple is moving inexorably to requiring all apps to be both sandboxed and notarized.

While no one at WWDC mentioned anything about these, could Apple be about to spring a surprise? After all, it hasn’t yet published its annual update to the Platform Security Guide, so reading between the lines, app security could well be set for a shocking change. Apple is then going to tell us how these changes are all to our benefit, just as my internet provider writes that losing useful access to my email service is an improvement in security.

Having watched my way through several of the security and privacy sessions at this year’s WWDC, I can confirm that Apple hasn’t announced any tightening of requirements for notarization or sandboxing. These currently amount to:

  • macOS requires that executables are signed (at least on Apple silicon Macs), but that can be ad hoc, and doesn’t require an Apple signing certificate.
  • Apps and other executable code distributed by developers must be fully signed and notarized, but normally there’s no requirement for sandboxing.
  • Apps and other executable code distributed through the App Store follow the Store’s rules for sandboxing (required) and signing (by Apple).
  • Hobbyists and others who aren’t developers aren’t required to sign with Apple certificates, or to notarize. However, those apps and other executable code should only be distributed personally, not online, and those who use them should be made fully aware of their vulnerabilities.
  • Distributing source code for others to compile and build is fine, provided everyone is aware of the risks involved.

My next step was to test this using one of my own apps. There are some catches for the unwary here, now that macOS relies on several different forms of identification for executable code, including the app’s bundle identifier (e.g. co.eclecticlight.myapp) and the code directory hashes, or cdhashes, in its signature. My test app is based on my own DelightEd, with a different identifier and substantial changes in its code to ensure different cdhashes.
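For those who’d like to inspect those identifiers themselves, here’s a minimal sketch in Swift using the Security framework’s static code calls. The app path is only a placeholder, and this isn’t how I ran my tests; the usual way is with the codesign tool.

```swift
import Foundation
import Security

// Print the signing identifier and cdhash of an app bundle.
// The path is just an example; substitute the app you want to inspect.
let appURL = URL(fileURLWithPath: "/Applications/MyTestApp.app")

var staticCode: SecStaticCode?
guard SecStaticCodeCreateWithPath(appURL as CFURL, [], &staticCode) == errSecSuccess,
      let code = staticCode else {
    fatalError("Couldn't create a static code object for \(appURL.path)")
}

var signingInfo: CFDictionary?
guard SecCodeCopySigningInformation(code, SecCSFlags(), &signingInfo) == errSecSuccess,
      let info = signingInfo as? [String: Any] else {
    fatalError("Couldn't copy signing information")
}

// kSecCodeInfoIdentifier is normally the bundle identifier;
// kSecCodeInfoUnique is the cdhash. Both may be absent if the code is unsigned.
print("Identifier:", info[kSecCodeInfoIdentifier as String] as? String ?? "unsigned")
if let cdhash = info[kSecCodeInfoUnique as String] as? Data {
    print("cdhash:", cdhash.map { String(format: "%02x", $0) }.joined())
}
```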

I first created an unsandboxed, un-notarized, unhardened test app with a new bundle identifier, and signed it using my developer certificate. This is how we used to build apps for distribution before the introduction of notarization.

I then made a copy of that signed app, stripped its signature using codesign, and re-signed it using an ad hoc signature. This is how non-developers used to build apps before the introduction of notarization, with the addition of ad hoc signing to satisfy the requirements of Apple silicon.
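I did that at the command line, but for anyone who prefers to script it, something along these lines should work; the app path is again only a placeholder, and an app with nested frameworks or helpers may need those signed separately.

```swift
import Foundation

// Strip an existing signature and replace it with an ad hoc one,
// by calling the codesign tool.
func run(_ tool: String, _ arguments: [String]) throws {
    let process = Process()
    process.executableURL = URL(fileURLWithPath: tool)
    process.arguments = arguments
    try process.run()
    process.waitUntilExit()
    guard process.terminationStatus == 0 else {
        throw NSError(domain: "codesign", code: Int(process.terminationStatus))
    }
}

let appPath = "/Applications/MyTestApp.app"
do {
    // Remove the existing Developer ID signature...
    try run("/usr/bin/codesign", ["--remove-signature", appPath])
    // ...then apply an ad hoc signature ("-" is the ad hoc signing identity).
    try run("/usr/bin/codesign", ["--force", "--sign", "-", appPath])
} catch {
    print("codesign failed:", error)
}
```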

Both apps were copied over AirDrop to a test Mac running a beta of the next version of macOS; using AirDrop ensured that they were quarantined. The apps were then moved into the Applications folder, and the ad hoc signed app was tested first.
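If you want to confirm that a copied app really has been quarantined, one way is to read its quarantine properties through Foundation, as in this rough sketch; the path is once more a placeholder.

```swift
import Foundation

// Check whether an app bundle carries quarantine information.
let appURL = URL(fileURLWithPath: "/Applications/MyTestApp.app")

do {
    let values = try appURL.resourceValues(forKeys: [.quarantinePropertiesKey])
    if let quarantine = values.quarantineProperties {
        print("Quarantined:", quarantine)
    } else {
        print("No quarantine attribute set")
    }
} catch {
    print("Couldn't read resource values:", error)
}
```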

Both apps opened as expected on the second attempt, using the Finder’s Open command, which ensures that the user is fully aware they aren’t notarized, just as in Ventura. To assess whether this had any effect on their privacy handling, each was then used to open and save files in protected folders, including Documents and Downloads. Neither action resulted in any privacy alerts, nor were the apps added to Privacy & Security Settings.

For anyone puzzled as to how an unsandboxed, un-notarized, unhardened and ad hoc signed app can read and write to protected folders without a flurry of dialogs, we have to go back to Apple’s principles on privacy, that the user remains in control and gives their consent. TCC stands for Transparency, Consent and Control, after all.

When an app uses the standard Open and Save dialogs, the user is in full control, and by clicking the Open or Save button actively consents to that action. This normally won’t require any additional privacy settings, nor trigger consent dialogs. For sandboxed apps, there’s a pair of entitlements, com.apple.security.files.user-selected.read-only and com.apple.security.files.user-selected.read-write, to cover this situation, but apps that don’t run in a sandbox don’t require those. macOS tries to strike the right balance between privacy and practicality.
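To show what that looks like from inside an app, here’s a bare-bones sketch of the standard Open dialog in Swift: because the user chooses the file themselves, a sandboxed app needs only the user-selected entitlement, and an app outside the sandbox needs none at all.

```swift
import AppKit

// Present the standard Open dialog. Whatever the user chooses becomes
// readable by the app because clicking Open is the consent; in a sandboxed
// app this relies on the com.apple.security.files.user-selected.read-write
// (or .read-only) entitlement, while an unsandboxed app needs no entitlement.
func openUserChosenFile(completion: @escaping (Data?) -> Void) {
    let panel = NSOpenPanel()
    panel.canChooseFiles = true
    panel.canChooseDirectories = false
    panel.allowsMultipleSelection = false

    panel.begin { response in
        guard response == .OK, let url = panel.url else {
            completion(nil)
            return
        }
        // No TCC alert appears here, even for Documents or Downloads.
        completion(try? Data(contentsOf: url))
    }
}
```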

Until Apple tells us otherwise, I think it’s clear that nothing is changing significantly in sandboxing and notarization that would prevent hobbyists and others who aren’t developers from continuing what they do currently, nor should this hinder the distribution and use of source code. For those who are developers, notarization is more than a formality: it’s recognition of the fact that the last thing any of us would wish on others is for their Macs to be compromised or exploited.