Maximum PC

ON-DEVICE VS. IN THE CLOUD: WHAT’S THE BEST WAY TO PROCESS AI?


Ask anyone for a real-world, usable example of AI and they’ll probably pick one of the big three digital assistants: Amazon Alexa, Apple Siri, or Google Assistant. But are such services better run locally on a device or up in the cloud?

Take Apple’s iPhone. You might be surprised just how much of the familiar iPhone functionality is driven, or at least enhanced, by AI. It’s used, for instance, to distinguish between intentional inputs and accidental brushes against the capacitive touchscreen.

Machine learning algorithms are also used to schedule charging, to improve long-term battery health, and to sort images in the Photos app into ready-made galleries and identify people using facial recognition. Depending on the age of a given iPhone, it may also be able to use machine learning to composite multiple images into a single ultra-high-quality photo. Even Apple’s predictive text is enhanced by machine learning.

What’s particularly interesting about Apple’s approach to AI is how it fits into the contest between AI processed locally on devices and AI processed up in the cloud, in data centers. Broadly, Apple is moving away from cloud AI toward on-device AI. Already, Apple builds dedicated silicon, such as the Neural Engine in its chips, that can be used for things like image analysis locally, on-device. In the future, that approach will increasingly become the norm. The next iteration of Apple’s mobile operating system, iOS 15, for example, will pull the processing for the Siri voice assistant out of the cloud and put it on the iPhone handset itself.

Partly, that’s down to privacy concerns, which Apple sees as a major distinguishing feature between its products and services and those of Amazon, Google, and Facebook, all of which, Apple argues, depend on knowing as much about you as possible.

“This addresses one of the biggest privacy concerns for voice assistants, which is unwanted audio recording,” Apple said. But it raises the question of whether on-device AI can possibly compete with the huge machine learning models hosted in data centers. Apple, predictably enough, says that it can.

“I understand this perception that bigger models in data centers somehow are more accurate, but it’s actually wrong,” says John Giannandrea, Apple’s senior vice president for machine learning and AI strategy. “It’s actually technically wrong. It’s better to run the model close to the data, rather than moving the data around. And whether that’s location data, like what are you doing, or exercise data, what’s the accelerometer doing in your phone, it’s better to be close to the source of the data. It’s also privacy preserving.”

Latency is another advantage of on-device AI.

Over to Giannandrea: “You’re taking a photograph, and the moments before you take a photo with the camera, the camera’s seeing everything in real time. It can help you make a decision about when to take a photo. If you wanted to make that decision on the server, you’d have to send every single frame to the server to make a decision about how to take a photograph.” That just doesn’t make any sense, Giannandrea says. It’s far better done on the device.
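A quick back-of-envelope calculation illustrates Giannandrea’s point. All the figures below are assumptions chosen for illustration (frame size, uplink speed, round-trip time, and on-device inference time are not Apple’s numbers), but the conclusion holds across any plausible values:

```python
# Back-of-envelope comparison: analyzing camera preview frames on-device
# vs. streaming every frame to a server. All figures are illustrative
# assumptions, not measured or vendor-published values.

FPS = 30                       # assumed camera preview frame rate
FRAME_BYTES = 2 * 1024 * 1024  # assumed ~2 MB per uncompressed preview frame
UPLINK_BPS = 20e6              # assumed 20 Mbit/s mobile uplink
ROUND_TRIP_S = 0.05            # assumed 50 ms network round trip
ON_DEVICE_S = 0.005            # assumed 5 ms per-frame on-device inference

# Cloud path: time to upload one frame, plus the network round trip.
upload_s = FRAME_BYTES * 8 / UPLINK_BPS
cloud_latency_s = upload_s + ROUND_TRIP_S

# Sustained uplink bandwidth needed to stream every frame.
required_bps = FRAME_BYTES * 8 * FPS

print(f"Cloud latency per frame:     {cloud_latency_s * 1000:.0f} ms")
print(f"On-device latency per frame: {ON_DEVICE_S * 1000:.0f} ms")
print(f"Uplink needed for 30 fps:    {required_bps / 1e6:.0f} Mbit/s")
```

Under these assumptions, the cloud path costs the better part of a second per frame and would need a sustained uplink of hundreds of megabits per second, while on-device inference answers in milliseconds with nothing leaving the phone.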

Of course, services that depend entirely on huge datasets will remain in the cloud for the foreseeable future. And given Apple’s emerging stance on privacy, plus the company’s undoubted prowess in developing both devices and the chips that power them, the on-device sales pitch definitely makes for a distinct proposition. But it remains to be seen whether consumers really care about privacy, and whether a Neural Engine or matrix math accelerator in an iPhone can really compete with exaflops of dedicated Google AI hardware in a data center.

Unlike Google, Amazon, and Facebook, Apple prefers on-device AI to using data centers.
