A thorough look at Live Text, and why it might need to connect to Apple’s servers. Could it be sending image identifiers or text extracted from your local images?
Visual Look Up
Can it be true that Apple is sent information about every image we browse in the Finder? An analysis of Visual Look Up and Live Text, and when each occurs.
Some suspect Apple of scanning images stored on your Mac. What evidence is there that this could be happening?
So macOS is being swallowed up into iOS? Have you forgotten how iPadOS is trying to establish itself as the middle ground between them?
Live Text appears to be straightforward OCR when you open an image containing recognisable text. What’s recorded in the log shows that it doesn’t work like that, though.
Full details on which languages and apps are supported, and an explanation of how it works its magic.
Requirements, which apps support it, how to access it in each of those, why some images don’t work, and how to troubleshoot it.
How Apple and third-party developers can update the apps you use without your being aware, even if you have automatic updates disabled.
How can you tell when software uses the Neural Engine in an M1 series Mac? How much power does it use, and what is Espresso? Mysteries unravelled.
How Apple’s Neural Engine ensures that your data remains private rather than being uploaded for processing on servers, and how it spares your CPU cores from running neural computation.