A few days ago, there was a wave of panic following a report that macOS had scanned a QR code in an image and accessed the link encoded within it. A few hours later, it turned out the whole thing was a false alarm, but by then it was clear that many Mac users believe that Apple scans images on your Mac. This article looks at the reality behind that fear.
Much of the current climate of suspicion over Apple and images stems from what was, in retrospect, a misguided attempt to consult openly about changes Apple had intended to introduce a year ago in Monterey: scanning for images containing CSAM (Child Sexual Abuse Material). Like so many contentious issues, this one was widely misunderstood and misinterpreted, and a lot of people came away believing that macOS would soon (if not already) be scanning images stored on their Mac in a bid to detect those containing CSAM.
That was never the case: Apple was proposing to check images as they were uploaded to iCloud Photos, and as a result of the outcry it announced that it would revise its proposals and consult again at a later date. As that hasn’t happened, the best assumption is that Apple hasn’t, for the time being at least, implemented that scheme, and there’s no evidence to suggest otherwise.
Image analysis in Photos
In the background, photoanalysisd works through new and changed images in the user’s current Photos library, performing its recognition tasks. What’s unusual about this service is that it’s suppressed while the Photos app is running, so one way to stop this analysis is to leave Photos open at all times.
Simple observation shows that photoanalysisd is only active on images in Photos libraries, that it concentrates mainly on people and, in Ventura, duplicate images, and that its results are only available through Photos. There’s no evidence of this information being released beyond that Mac, let alone being provided to Apple; the latter appears highly improbable anyway, given the vast number of images currently stored in Photos libraries.
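Apple hasn’t published how photoanalysisd detects duplicates, but the general technique is to compute a compact fingerprint of each image locally and group images that share one. This minimal sketch uses a plain content hash; a real system would use a perceptual hash that tolerates resizing and recompression.

```python
# Illustrative sketch only, not Apple's implementation: group images
# in a library by a fingerprint computed entirely on the local Mac.
import hashlib
from collections import defaultdict

def content_key(data: bytes) -> str:
    """Fingerprint the raw image bytes; a real duplicate detector would
    use a perceptual hash that survives resizing and recompression."""
    return hashlib.sha256(data).hexdigest()

def find_duplicates(images: dict[str, bytes]) -> list[list[str]]:
    """Return groups of filenames whose contents share a fingerprint."""
    groups: dict[str, list[str]] = defaultdict(list)
    for name, data in images.items():
        groups[content_key(data)].append(name)
    return [names for names in groups.values() if len(names) > 1]

# Hypothetical miniature library: two byte-identical copies of one photo.
library = {
    "IMG_0001.jpg": b"\xff\xd8...photo-a",
    "IMG_0002.jpg": b"\xff\xd8...photo-b",
    "IMG_0003.jpg": b"\xff\xd8...photo-a",
}
print(find_duplicates(library))  # → [['IMG_0001.jpg', 'IMG_0003.jpg']]
```

Nothing in this process needs a network connection, which is consistent with the observation that the results only ever surface inside Photos.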
Visual Look Up
This feature was introduced in Monterey, and is expanding in Ventura. Class coverage in Monterey includes breeds of cats and dogs, landmarks, flowers and plants, and paintings, to which Ventura adds common birds, insects and some statues. This is only performed for images chosen by the user for Visual Look Up, and can be disabled by unticking the Siri Suggestions item in the Search Results tab of the Spotlight pane.
As it’s well documented by log entries, it’s not difficult to work out how Visual Look Up operates. Image analysis occurs locally and results in an object identifier, akin to the ‘neural hash’ proposed for CSAM detection, which is then looked up in a remote database. Unlike other forms of image recognition, such as Google’s reverse image search, the image itself remains local and isn’t sent to a server for analysis or recognition.
Spotlight image search
Ventura extends Spotlight search to include information from image analysis. Apple describes this as using “information from images in Photos, Messages, Notes and the Finder to enable searching by locations, scenes, or even things in the images, like text, a dog or a car.”
For this to work with images in the Finder and elsewhere, a background service like photoanalysisd would need to crawl through images stored locally, generating object identifiers similar to those used in Visual Look Up, applying Live Text OCR to any text discovered in the images, and storing the results in that volume’s metadata indexes. That process hasn’t been analysed yet, but it’s most unlikely to involve the transmission of images beyond the Mac.
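The kind of index this implies can be sketched with a few lines of Python. This is a hypothetical illustration, not Spotlight’s actual format: object labels and recognised text are stored per file under each search term, so a later query matches terms without ever touching the images again.

```python
# Hypothetical sketch of a Spotlight-style metadata index for images:
# search terms (object labels and OCR'd words) map to file paths.
from collections import defaultdict

index: dict[str, set[str]] = defaultdict(set)

def index_image(path: str, labels: list[str], live_text: str) -> None:
    """Store object labels and recognised words as search terms.
    The labels and text would come from local image analysis and OCR."""
    for term in labels + live_text.lower().split():
        index[term.lower()].add(path)

def search(term: str) -> set[str]:
    """Look a term up in the local index; no image data is consulted."""
    return index.get(term.lower(), set())

# Hypothetical indexing pass over two local images.
index_image("~/Pictures/walk.jpg", ["dog", "park"], "Keep dogs on leads")
index_image("~/Pictures/drive.jpg", ["car", "road"], "")
print(search("dog"))  # → {'~/Pictures/walk.jpg'}
```

Everything a query needs lives in the index on the volume, which is why this design doesn’t require sending images anywhere.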
If you don’t want Spotlight to index any of your documents, including images, the Spotlight pane allows you to exclude different types of search result, and to make specific folders private so they’re not indexed at all.
Far from being secretive about image analysis in macOS, Apple has strongly promoted features that rely on it. It’s relatively easy to detect and intercept the upload of images and information using a software firewall, and most of these processes make copious log entries. Before fuelling suspicion, careful analysis should make it possible to distinguish clearly between features that analyse local images and those that give Apple access to any of your images. As far as I know, Apple operates only one scheme in which it receives copies of your own images, and that’s a voluntary scheme to improve image coverage in Maps, which I suspect you weren’t even aware of.