A thorough look at Live Text, and why it might need to connect to Apple’s servers. Could it be sending image identifiers or text extracted from your local images?
Live Text
Some suspect Apple of scanning images stored on your Mac. What evidence is there that this could be happening?
Live Text appears to be straightforward OCR when you open an image containing recognisable text. What’s recorded in the log shows that it doesn’t work like that, though.
Full details on which languages and apps are supported, and an explanation of how it works its magic.
Using Live Text to recognise kanji characters in a screenshot, then translating them from Japanese to English. Does it work?
How Apple and third-party developers can update the apps you use without your being aware, even if you have automatic updates disabled.
How Apple’s Neural Engine keeps your data private by processing it on-device rather than uploading it to servers, and spares your CPU cores from running neural computation.
For some images, Visual Look Up fails so completely that it’s not even offered. Could this be exploited as a way of blocking image recognition?
Visual Look Up also recognises flowers, landmarks and pets, as well as well-known paintings. Here’s how it recognises those, and how Live Text differs.
You might have been using Visual Look Up for a few months now, or could still be unable to get it to work. Why some features aren’t available everywhere, or on all supported Macs.