Live Text appears to be straightforward OCR when you open an image containing recognisable text. What's recorded in the log, though, shows that it doesn't work like that.
Full details on which languages and apps are supported, and an explanation of how it works its magic.
Using Live Text to recognise kanji characters in a screenshot, then translating them from Japanese to English. Does it work?
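For those who want to try this programmatically, the recognition step can be sketched using Apple's Vision framework, which provides the same class of on-device text recognition that Live Text draws on. This is a minimal illustration, not Live Text's own implementation: the image path is a placeholder, and the choice of Japanese plus English as recognition languages is an assumption for the kanji scenario described above.

```swift
import Vision
import AppKit

// Load a screenshot from disk (placeholder path — substitute your own).
let url = URL(fileURLWithPath: "/path/to/screenshot.png")
guard let image = NSImage(contentsOf: url),
      let cgImage = image.cgImage(forProposedRect: nil, context: nil, hints: nil) else {
    fatalError("Could not load image")
}

// Ask Vision to recognise text, preferring accuracy over speed.
let request = VNRecognizeTextRequest { request, _ in
    guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
    for observation in observations {
        // Take the top candidate for each detected region of text.
        if let candidate = observation.topCandidates(1).first {
            print("\(candidate.string) (confidence: \(candidate.confidence))")
        }
    }
}
request.recognitionLevel = .accurate
request.recognitionLanguages = ["ja", "en"] // assumption: Japanese and English text

// Run the request against the image.
let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
try handler.perform([request])
```

Translation is a separate step: Vision only returns the recognised strings, and passing them on to a translation service is left out here.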
How Apple and third-party developers can update the apps you use without your being aware, even if you have automatic updates disabled.
How Apple’s Neural Engine keeps your data private by processing it on-device rather than uploading it to servers, while also sparing your CPU cores from running the neural computation.
For some images, Visual Look Up fails so completely that it’s not even offered. Could this be exploited as a way of blocking image recognition?
Visual Look Up also recognises flowers, landmarks and pets, as well as well-known paintings. Here’s how it does those, and how Live Text is different.
You might have been using Visual Look Up for a few months now, or you could still be unable to get it to work. How some features aren’t available everywhere, or on all supported Macs.
This should be one of the most transformative features we’ve seen in a recent version of macOS. Its OCR is excellent, and uses recognised text to link to knowledge.
When you upgrade your Mac to Monterey, just which of its new features do you expect it to support? All is made clear in this table.