If there’s one thing I’m confident we can all agree on, it’s that none of us wants malicious software on our Mac. How we go about achieving that will spark debate, but once again I’m sure we can agree that we don’t want such malware to run in the first place, no matter how good other defences against it might be.
Because there’s no one perfect defensive layer that can block malware from running while allowing wanted software to run without hindrance, effective defences are built in layers, like an onion. One common requirement of these layers is being able to distinguish what is known to be good, and so trusted, from what is at best suspicious, which mustn’t be run until trust has been established.
One proven and popular way of establishing trust in software is checking its integrity using a combination of hashes and certificates. When a developer builds an app or any other executable code, the hashes of what they have built are stored in that product, and validated using a security certificate, traceable through a chain of trust to a Certificate Authority. App and other code signatures do that and more, including establishing entitlements that allow the code to operate beyond the limits imposed by a sandbox or other restrictions. But they all need to establish the identity of the developer, and confirm that what’s in the app or code hasn’t been tampered with.
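You can see that chain at work from the command line. This is only a sketch, using Apple’s codesign tool; Safari is just an example path, and any signed app would do:

```shell
# A sketch of inspecting and verifying an app's code signature with codesign.
# Safari is an example; substitute the path of any installed app.
APP="/Applications/Safari.app"

# Display the signing identity and the chain of certificate authorities.
codesign --display --verbose=2 "$APP"

# Re-check the stored hashes against the app's current contents,
# confirming it hasn't been tampered with since it was signed.
codesign --verify --deep --strict --verbose=2 "$APP"
```

If the contents no longer match the hashes sealed into the signature, the second command reports an error instead of returning success.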
Although code signatures and hashes form one layer, they’re far from complete as protection. Malicious actors can sign code too, and it takes time to identify the certificates they’ve used and render them invalid by revoking them. A more proactive approach is to check signed software to see whether it appears malicious; those apps deemed to be free of malware then have their hashes recorded, and a traceable ticket issued to confirm those hashes. That’s essentially what happens in Apple’s notarization service.
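You can ask Gatekeeper’s policy engine for its verdict on an app, which takes notarization into account. A sketch, in which the app path is a hypothetical example:

```shell
# A sketch of checking an app against Gatekeeper policy, which includes
# looking for a notarization ticket. The path is a hypothetical example.
spctl --assess --type execute --verbose "/Applications/MyApp.app"

# If a ticket has been stapled to the app, it can also be validated directly.
xcrun stapler validate "/Applications/MyApp.app"
```

A notarized app is reported as accepted with its source given as notarization; one whose ticket has been revoked is rejected.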
Like code signing, notarization can be abused, but for a malicious actor this is a greater challenge, and forces them to conceal their code using techniques such as steganography. That’s significant effort when they have to accept that it won’t be long before it’s detected, and their notarization tickets and signatures are revoked. This all raises the bar for those trying to ruin our day by sneaking their malware onto our Macs. There are other layers too: XProtect’s Yara-based scans for the signatures of known malicious code, the new XProtect Remediator that looks for evidence of malware activity and remediates it, and more.
As a developer, I dislike code signing and notarization. Having started building commercial software for Macs back in 1989, I well remember years of productive work without ever having to sign anything, let alone submit it to Apple. I also remember some of the first Mac malware, including the expensive monthly magazine that distributed a virus to its readers on the cover disc. Mac malware was then quite spectacular, unusual and sometimes even esoteric.
Times have changed. Gone are cover discs; indeed, almost all the magazines have since vanished. Slow Mac-to-Mac transmission of malware has been replaced by the power and speed of spread over the internet, going viral in more ways than one. Malware developers now grow rich on their criminal proceeds, and flourish almost undisturbed by law enforcement. Software developers can’t ignore malware, and have an obligation to play their part in helping users protect themselves. If you’re going to offer software to the public, the very least you can do is ensure that it supports macOS security protection.
Those selling products, particularly security protection, and those who advise Mac users have an even greater obligation to their clients and customers as users. No matter how good you think your product, service or advice might be, encouraging the naive and credulous to bypass or disable Apple’s layers of security protection is not in their best interests. By all means add layers of protection for them, but never claim that Apple’s are ineffective, dispensable or can be replaced by something else. Whatever you might believe, or want to profit from, never encourage a user to take unwitting or unnecessary risks.
For the hobbyist developer, this comes hard. If your software comes as open source for the user to build themselves, you don’t have anything to do with signing or notarization, and your users don’t either.* But the moment you distribute a built product, particularly presented in an Installer package, you can’t shirk your responsibility to the user and expect them to bypass macOS security checks. There are ways around this, such as collaboration with someone who’s prepared to sign and notarize your software with you.
Apple introduced notarization in June 2018, well over four years ago. I notarized my first (free) app here at the end of July, and over the next ten months notarized over 30 different apps on about a hundred occasions. By the following June I was notarizing the (free) command tools that I provide, including silnite, which has been downloaded more than 100,000 times. These processes aren’t always easy, and I well understand how some complex products become difficult. But it’s our obligation to help our users avoid running malware on their Macs. Telling them to bypass macOS security if they want to run our software really isn’t responsible.
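The workflow has also become simpler over time. A sketch using notarytool, available from Xcode 13 onwards; the app name, archive name and keychain profile here are all placeholders:

```shell
# A sketch of notarizing a signed app with notarytool (Xcode 13 or later).
# "MyApp.app" and the "notary-profile" credentials are placeholders.

# Compress the signed app for upload.
ditto -c -k --keepParent "MyApp.app" "MyApp.zip"

# Submit it to Apple's service and wait for the scan and the ticket.
xcrun notarytool submit "MyApp.zip" --keychain-profile "notary-profile" --wait

# Staple the ticket to the app so Gatekeeper can check it offline.
xcrun stapler staple "MyApp.app"
```

The keychain profile is set up once beforehand with `xcrun notarytool store-credentials`, using an App Store Connect app-specific password or API key.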
* In fact, all executable code is now signed when it’s built, because of the requirement imposed by Apple silicon Macs, which can’t and won’t run unsigned Arm64 code. However, unless you explicitly sign using an Apple-issued certificate, the signature will be ad hoc, and thus lacks traceability through a chain of trust. It’s no substitute for a proper signature, and isn’t accepted for notarization.
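You can see the difference for yourself by ad hoc signing a copy of a binary, as the build tools do automatically for Arm64 code. A sketch, in which the file name is a placeholder:

```shell
# A sketch of ad hoc signing: '-s -' signs with no identity at all.
# "./mytool" is a placeholder for any Mach-O binary you've built.
codesign --force -s - "./mytool"

# The report shows 'Signature=adhoc' and no certificate chain,
# so there's no identity to trace, and notarization would refuse it.
codesign --display --verbose=2 "./mytool"
```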