With Apple’s annual WWDC just a week away, all speculation now turns to what it will announce there, which new chips we’ll be tempted by, and what macOS 13 will offer. As I have no inside information, I’ll offer a few thoughts based on what we do know.
Unless Apple surprises all its engineers too, it will announce macOS 13, for release in the autumn/fall. As with last year’s announcement of Monterey, I suspect this won’t bring the architectural changes of previous major releases, but rather more substantial development of features such as those I’ll discuss later.
More importantly, it means that many of us will spend the coming summer wrestling with its beta releases. Even if these aren’t radical departures, this is a good time to remind ourselves that testing betas of macOS on production systems is asking for trouble. If you can’t afford to lose that Mac and everything on it for hours or even days, then please don’t let a beta near it.
macOS 13 also means that Apple’s support calendar is about to roll over. If your Mac is still running Catalina, you’re about to lose all further security updates. If it can be upgraded to Big Sur or Monterey, now’s the time to plan that, before it becomes completely unsupported.
Monterey 12.4 is, in my experience, a big leap forward from Catalina. New features like the SSV and Universal Control are major benefits, and its only noticeable downside is the still-unfixed memory leak in the Finder’s Find feature, which can be worked around. If you’re still hesitant, then upgrading to Monterey this summer will put you in a better position to decide whether to head on up to macOS 13 later.
The autumn/fall will also inevitably see the introduction of new Apple Silicon chips, I suspect in the M2 series, given that we’ve already been told the M1 family is complete. Unless the supply situation improves dramatically, you may still do better ordering an M1 Mac. The bigger leap is from Intel to M1; how much further you’ll be able to go this year with a successor is uncertain.
While Apple could roll out its high-end successors to the M1 family this year, it’s more likely that introduction will proceed in a similar sequence to the M1. Yields and early problems are likely to be best with the simpler variants in the new family. I think you’ll be very lucky to see a replacement for the M1 Ultra for a good while yet, but the basic M2 and its Pro sibling look more likely in the next year.
Watch the ANE
To understand where the greatest changes are likely to occur, I go back five years to Apple’s replacement of the A10 with the A11. Not only did that bring simultaneous use of E and P cores, a major milestone in the evolution of Apple’s ARM chips, but it also introduced the first Apple Neural Engine (ANE). Although a pale shadow of the ANE incorporated into all Apple’s current chips, it saw immediate use in Face ID, Animoji, and some lesser tasks.
In the nearly five years since, the ANE has remained something of a mystery. One of the interesting features of the powermetrics command tool is that it reports the power consumption of the ANE. Each time I run powermetrics, I glance at that figure, and almost without exception it has remained 0 mW. The only glimmer of activity I have seen is when using Visual Look Up. I’m sure that if I stayed up into the small hours, when the services supporting Photos image recognition seem most active, I’d catch the ANE working its socks off, but at that time of day my M1 Mac is welcome to all the cores it wants for such tasks.
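If you want to watch for ANE activity yourself, something along these lines is a starting point. This is a sketch rather than a recipe: sampler names and output format vary between macOS versions and between Intel and Apple Silicon Macs, so check man powermetrics on your own system first.

```shell
# Take a single one-second power sample on an Apple Silicon Mac
# (powermetrics requires admin rights), and pick out any line
# mentioning the ANE. On my M1 systems the cpu_power sampler's
# report includes an "ANE Power" line alongside CPU and GPU figures.
sudo powermetrics --samplers cpu_power -i 1000 -n 1 | grep -i "ANE"
```

If that prints "ANE Power: 0 mW", the neural engine was idle during the sample, which is what I see almost all of the time.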
When designing chips like the M1, silicon is a costly investment, and you don’t add a novel system like the ANE on a whim. Its presence in iPhones, iPads and M1 Macs means that Apple intends to do a great deal with it: it has to earn its keep and pay its rent on the chip.
Like the M1’s GPU, the ANE isn’t hardware that developers can access directly. There are two main routes to it: Apple’s tools for Machine Learning (ML), which are good for developing apps that learn to classify or analyse images, for instance; and some features of its extensive Accelerate computational libraries, although those are black boxes, so we don’t know which of them use the ANE.
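To illustrate the ML route, here’s a minimal Core ML sketch. The developer never addresses the ANE directly: the most they can do is tell Core ML which classes of compute unit it may consider, and the framework decides where each part of the model actually runs. MyClassifier here is hypothetical, standing in for the class Xcode generates from a compiled .mlmodel file.

```swift
import CoreML

// Allow Core ML to use whichever compute units it judges best:
// CPU, GPU, or the ANE where the model's layers support it.
let config = MLModelConfiguration()
config.computeUnits = .all

// MyClassifier is a hypothetical model class generated by Xcode
// from a .mlmodel; Core ML schedules its work across the available
// hardware, so whether the ANE is engaged is the framework's choice.
// let model = try MyClassifier(configuration: config)
// let output = try model.prediction(input: someInput)
```

Setting computeUnits to .cpuOnly instead is one way to compare timings and infer whether the ANE (or GPU) was doing the work.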
Other than running neural networks for ML, there’s a great deal more that the ANE is capable of. Whether it has a role in future developments in Augmented Reality, or more extensive ML features, we can only speculate. But if I had to put my money anywhere, it would be on the ANE working harder in the coming months and years, to our advantage. That’s what I’m going to be watching for at WWDC this year.