How Apple should help Ukraine by fulfilling its promises

As it looks increasingly likely that the war in Ukraine will drag on for months rather than days, prolonging human suffering, it also becomes more likely that those who continue to resist there and elsewhere will need stronger protection against state surveillance.

Although Apple has made a big thing of protecting privacy, macOS has one hole which Apple acknowledged over a year ago and promised to fix, but still hasn’t closed. Every time you open an app, macOS checks the validity of its developer’s signing certificate. If that certificate hasn’t been checked with Apple recently, your Mac connects to Apple’s servers to check it, an action which could reveal information to an eavesdropper. This became public knowledge in November 2020, and Apple was driven to undertake that it would address the problem within a year, but doesn’t appear to have done so, at least not as far as Monterey 12.2.1 is concerned.

Over a year ago, this affected relatively few Mac users, such as investigative journalists operating under repressive regimes. There are workarounds which are used by those able to prepare their Macs for work in such circumstances. With the growing likelihood that large numbers of Ukrainians, and quite possibly Russians, will soon be forced to live under repression, it’s time for Apple to meet its promises, protect their privacy, and explain just what it has done.

What used to happen

This came to public attention because, when Apple released Big Sur 11.0.1 on 12 November 2020, it encountered server problems. Among the servers affected were those at ocsp.apple.com, which prevented many users from launching apps. Investigation by Jeff Johnson and others revealed that, at that time, online checks of developer certificates used to sign apps were conducted over plain HTTP connections and were only cached for 5 minutes.

As a result, almost every time that you opened a signed app on your Mac, it underwent an online certificate check over an unencrypted connection to Apple’s OCSP servers, which contained information an eavesdropper could use to identify which app you had just opened.

These revelations caused a storm, and Apple responded quickly, promising change.

What Apple promised

Apple stated that “these security checks have never included the user’s Apple ID or the identity of their device”, and that it had stopped logging users’ IP addresses. Furthermore, it promised that certificate revocation checks would change over the following year to feature:

  • “a new encrypted protocol”;
  • “strong protections against” [OCSP] “server failure”;
  • “a new preference for users to opt out of these security protections”.

Apple embodied these in a Support document first published in November 2020. It has since been updated, and now claims to have been published on 30 April 2021, which is misleading, as that’s merely a revision of the original from five months earlier. However, those three promises remain.

What seems to happen now

For some months, I have been asking whether Apple has fulfilled those promises made well over a year ago. As Apple hasn’t seen fit to update its Support Note, and users still have no preference to opt out of these security protections, it’s easy to assume that nothing has been done at all. However, careful testing in Monterey 12.2.1 reveals some signs of change.

First, Apple immediately extended the period of caching from 5 minutes to 12 hours, as discovered by Jeff Johnson. Although that might appear to reduce the problem greatly, it still means daily checks in practice. Judging by reports of problems encountered by users, Apple does appear to have made the service more robust, but that’s hard for a user to test.

In addition, OCSP checks today appear to be performed over TLS (HTTPS) rather than plain HTTP, using servers at ocsp2.apple.com. That’s true for both Monterey and Big Sur, but I don’t know whether it also applies to earlier versions of macOS.

The third promise, to provide a new preference to allow users to opt out of OCSP checks altogether, clearly hasn’t been implemented at all. If a user wishes to do that, they’re still forced to configure a software firewall to block all outgoing connections to ocsp.apple.com and ocsp2.apple.com, or to strip signatures from apps they don’t want checked in this way. I give detailed advice in this article.
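For those who want to block these checks now, a hosts file redirection is the simplest of those workarounds. The following is a sketch only: the two hostnames are those given above, but the 0.0.0.0 redirection is a common blocking convention, not anything Apple recommends.

```
# /etc/hosts additions (sketch): redirect Apple's OCSP hostnames to an
# unroutable address so outgoing certificate checks fail quickly and locally.
0.0.0.0 ocsp.apple.com
0.0.0.0 ocsp2.apple.com
```

Note that a hosts file block can’t discriminate by port, so it blocks both the old HTTP and the new HTTPS checks at once; a software firewall gives finer control.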

What Apple should do

Above all else, Apple now needs to explain properly to users, particularly those in Ukraine and other nations which are dangerous places to use a Mac, exactly how it protects code signature checks from eavesdropping. Which versions of macOS provide checks using robust protection? What is that protection?

Apple also now needs to deliver on its promises of over a year ago. It’s all very well for those of us whose greatest worries are Covid and whether this barbaric war will disturb our comfortable lives. For those in Ukraine, maintaining the privacy of what users do on their Macs could make the difference between life and death.

Postscript

I’m very grateful to Jeff Johnson @lapcatsoftware for testing this on Big Sur; I’ve incorporated his results above.

One important point worth emphasising is that, if you’ve configured a Hosts file or firewall to block outgoing requests to ocsp.apple.com, and you want to block Apple’s new checks, you’ll need to set your block on ocsp2.apple.com over port 443. You can find a full reference to these and other servers here. I’ve updated my linked article about how to use apps privately with this information.
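If you prefer macOS’s built-in packet filter, pf, to a third-party firewall, a rule along these lines should achieve the same block on both servers and ports. This is a sketch rather than a tested configuration: pf resolves hostnames in a table only once, when the ruleset is loaded, so if Apple changes its server addresses the table goes stale until you reload it.

```
# /etc/pf.conf additions (sketch): block outgoing OCSP checks.
# Hostnames in a pf table are resolved when the ruleset is loaded.
table <apple_ocsp> { ocsp.apple.com, ocsp2.apple.com }
block out quick proto tcp to <apple_ocsp> port { 80, 443 }
```

Reload the ruleset with `sudo pfctl -f /etc/pf.conf`, and enable pf with `sudo pfctl -e` if it isn’t already running.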

Tests here on Monterey 12.2.1 suggest that such blocks are effective and carry little penalty. Although trustd sends a volley of requests, possibly as many as 50, these now appear to be made asynchronously, so they don’t delay app launch. However, older versions of macOS may not take this as kindly, and a block could significantly delay the launch of an app whose certificate is no longer in the cache. I also suspect that Apple may have quietly extended cache life beyond 12 hours, although that’s harder to test. Of course Apple could make all this far clearer and safer for everyone concerned, and tell us what it has done.