Last Week on My Mac: Fessing up

If you’re using Catalina or Big Sur, you should by now be obtaining apps from only four sources: those delivered by an Apple installer or update, any you get from the App Store, notarized apps from other online sources, and those you build yourself. We’re assured that all four of those sources are deemed ‘safe’.

Apple reviews App Store apps in detail, but there’s been a long succession of apps which have pulled tricks such as sending private data to their servers, presumably to be sold on to others. However carefully Apple might try to check apps during review, developers who are determined enough seem able to get away with it, at least until someone exposes what they’re doing.

One recent positive change has been the requirement to provide details of App Privacy, although that doesn’t come into effect until the next time that app is updated, so you’ll see few doing this just yet. It’s also entirely up to the developer how honest they are about their “privacy practices and handling of data”. That presumably means that we’re left to police each developer’s accuracy and compliance ourselves.

Notarized apps are even more difficult. As I’ll explain later this week, notarization seems primarily about early detection of malware. Among the few requirements of every notarized app is that it opts in to the hardened runtime, which is intended to provide robustness in the face of attack by malware. This is controversial, and in any case, although all notarized apps have to opt in, they’re also allowed to opt out of most of its individual protections.
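If you want to check how Gatekeeper views an app you’ve downloaded, macOS provides the `spctl` tool. This is a rough sketch using a hypothetical app path; on a notarized app, the assessment typically includes a line such as `source=Notarized Developer ID`:

```shell
# Hypothetical path -- substitute the app you want to check.
APP="/Applications/Example.app"

if command -v spctl >/dev/null 2>&1; then
  # Ask Gatekeeper to assess the app as if it were about to run.
  # A notarized app typically reports "source=Notarized Developer ID".
  spctl --assess --verbose=4 --type execute "$APP" 2>&1 || \
    echo "Assessment failed, or no app found at $APP"
else
  echo "spctl not available on this system"
fi
```

Note that `spctl` assesses the whole bundle, so it’s a verdict on the app as signed and stapled, not on what the app does once running.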

Only this week, security researcher Csaba Fitzl @theevilbit uncovered what he considers to be a vulnerability in Streamlabs OBS, a consequence of its opting out of the full hardened runtime. That got me thinking about how little information we developers provide users about the security and privacy aspects of our apps, and whether we should all be more forthcoming. When I raised this, it didn’t go down well: it seems to be accepted that even advanced users know almost nothing about the hardened runtime, which I fear is probably accurate, and true of many developers too.
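For those who’d like to see such opt-outs for themselves, `codesign` can display both the hardened runtime flag and an app’s entitlements. A minimal sketch, again with a hypothetical app path:

```shell
# Hypothetical path -- substitute the app you want to inspect.
APP="/Applications/Example.app"

if command -v codesign >/dev/null 2>&1; then
  # Display signing information: the hardened runtime appears as the
  # "runtime" flag in the CodeDirectory line, e.g. flags=0x10000(runtime).
  codesign --display --verbose=2 "$APP" 2>&1 || true

  # List entitlements: com.apple.security.cs.* keys, such as
  # com.apple.security.cs.disable-library-validation, mark opt-outs
  # from individual hardened runtime protections.
  codesign --display --entitlements - "$APP" 2>&1 || true
else
  echo "codesign not available on this system"
fi
```

An app can be notarized and still carry several of those `com.apple.security.cs.*` exception entitlements, which is exactly the gap between “has a hardened runtime” and “is hardened”.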

It seems that we like things clear and simple. If it’s an App Store app, we know it runs in its sandbox and has been reviewed by Apple, so it must be secure and not raise any privacy concerns? If it has been notarized, it has a hardened runtime (whatever that might be) and has been checked by Apple for malware, so it must be secure and not compromise our privacy?

I think such faith is badly misplaced, as has been demonstrated so often by security researchers like Csaba Fitzl.

Many users, thanks to tools like Little Snitch, are now becoming much better-informed about some aspects of the security and privacy of the apps they use. Every so often, I get an email from someone who has discovered that SilentKnight, or another of my apps, appears to phone home, or at least to my GitHub server. I have actually documented that app’s online behaviours, including explaining its connection to my GitHub server, and why it does this, but it’s now common for us to ask questions rather than spend a little time checking the documentation provided.

I’m beginning to wonder if I shouldn’t be adding clear statements on what security vulnerabilities each app might have, such as opt-outs from the hardened runtime, and what protection there is for your privacy. These are reasonable, if not important, questions for users to ask.

Does SilentKnight, for example, upload any information it has gathered about your Mac? Without going to the lengths of inspecting all its packet exchanges with the GitHub server, it’s very hard for a user to find out. Opening its source might allow some to verify that it doesn’t in fact upload any data at all, and that I can’t even tell whether it was a Mac that asked for an excerpt from my databases (which you can also access and inspect if you wish). But for the great majority of users, open source is actually a closed book. Trying to grok someone else’s source code is often little easier than reverse-engineering it, another skill confined to a tiny proportion of users.
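Short of capturing packets, a curious user can at least list an app’s current network connections with `lsof`, which ships with macOS. A rough sketch; the process name here is an assumption about how the app appears in the process list, and only connections open at the moment you run it will show:

```shell
# Hypothetical process name -- substitute the app you're curious about.
PROC="SilentKnight"

if command -v lsof >/dev/null 2>&1; then
  # -i lists network files; -n and -P keep hosts and ports numeric,
  # which is faster and unambiguous. Then filter for the process.
  lsof -i -n -P 2>/dev/null | grep -i "$PROC" || \
    echo "No current network connections found for $PROC"
else
  echo "lsof not available on this system"
fi
```

This only shows that a connection exists and to where, not what travels over it, which is why tools like Little Snitch, or full packet inspection, remain the more thorough answer.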

I’d like your opinion on this. Would a clearly worded explanation of an app’s security and privacy be of value to you? Or would it, like most other documentation, just be ignored? In the event that my promised account of the hardened runtime changes your views, you’re welcome to revise your opinion, of course.