MacFormat

MATT BOLTON IS WORRIED THAT APPLE’S NEURAL CHIPS ARE BRINGING BACK SOFTWARE CLASS DIVISIONS WE’D NEARLY LEFT BEHIND


One of the benefits of Apple producing its own chips, and of a general plateauing of Intel chip improvements in Macs even before the switch, is that over the last few years we didn’t get much ‘left behind’ syndrome when new OS updates were released. In the days of OS X, it wasn’t uncommon for some of the new features of an update to be locked away if your Mac was too old. But we got to a point where that wasn’t so much of an issue, and effectively any new feature coming to Mac, and especially iPhone, would be available on anything the software ran on.

The ‘left behind’ syndrome didn’t totally go away (on iPad, it really reared its head recently with Stage Manager, which only works on specific machines with heftier processors), but for most of us, the fear of our devices being missed off the list of cool upgrades is rare now. But it’s growing again in 2023, and it seems to come down to the rise of features based on machine-learning techniques that require Neural chips. There are two obvious examples from this year of new ‘features’ that are all about software, but aren’t available on older models because their processors aren’t up to it.

The first is the Adaptive Audio features that just arrived on AirPods Pro 2. These create a new kind of Transparency mode that more selectively blends the outside world with what you’re hearing, and can automatically switch modes when you talk to someone, so you can hear the response. Evidently, it requires the power of the H2 chip, which is only used in the second-gen AirPods Pro right now. That’s a big shame for owners of the first-gen AirPods Pro, but it’s even more frustrating for anyone who recently bought AirPods Max – Apple’s most expensive and premium headphones, still on sale right now, and lacking support for these latest features.

The second is the new ‘double tap’ gesture on the Apple Watch Series 9, which is detected using heart and motion sensors that are the same as on Apple Watch Series 8 (and 7), but seems to need the new Neural chip to accurately interpret them… even though a version of this feature has been available on Apple Watch for a while, among the Accessibility options. And while we’re talking about Apple Watch, let’s also mention the new on-device Siri speech understanding, which again is available on the new model solely because of its Neural comprehension power.

All of these are very useful features. None of them requires new dedicated hardware such as sensors or mics to make them happen – it’s about having the processing power to run them to Apple’s satisfaction. We’re entering a world of very interesting software possibilities with the rise of machine-learning tools and the hardware to run them in every device, but I don’t love that it’s bringing back a two-tier system of software, with divisions even between devices being sold today.

Features based on machine-learning techniques require Apple’s Neural chips

ABOUT MATT BOLTON

Matt is Managing Editor at TechRadar.com, and previously worked on T3, MacLife and MacFormat. He’s been charting Apple’s ups and downs since his student days, but still hopes to hear “one more thing”.

The new ‘double tap’ gesture would be a great upgrade for existing Apple Watch users, but it’s on Series 9 only.
If I had just bought Apple’s most premium headphones, and they didn’t get the latest features, I’d be very unhappy.
