So macOS is being swallowed up into iOS? Have you forgotten how iPadOS is trying to establish itself as the middle ground between them?
Live Text appears to be straightforward OCR when you open an image containing recognisable text. What’s recorded in the log says that it doesn’t work like that, though.
Full details on which languages and apps are supported, and an explanation of how it works its magic.
Requirements, which apps support it, how to access it in each of those, why some images don’t work, and how to troubleshoot it.
How Apple and third-party developers can update the apps you use without your being aware, even if you have automatic updates disabled.
How can you tell when software uses the Neural Engine in an M1 series Mac? How much power does it use, and what is Espresso? Mysteries unravelled.
How Apple’s Neural Engine keeps your data private by processing it on-device rather than uploading it to servers, and spares your CPU cores from running neural computation.
For some images, Visual Look Up fails so completely that it’s not even offered. Could this be exploited as a way of blocking image recognition?
Visual Look Up also recognises flowers, landmarks, pets and well-known paintings. Here’s how it does those, and how Live Text is different.
The first phase analyses the image, classifying and detecting any objects within it. When the user clicks on the white dot, the process completes with a search for the best match.