HWM (Singapore)

APPLE’S NEW A12 BIONIC CHIP IS THE UNSUNG HERO FEATURE OF THE IPHONE XS

Its neural engine is evidence that Apple is serious about A.I.

- By Ng Chong Seng

Over the years, consumers have learned that Apple’s “S” suffix for the iPhone is about improvements under the hood. But there were always one or two standout innovations that went beyond the usual upgrades: Siri on the iPhone 4S, iOS 7 and Touch ID on the iPhone 5S, and 3D Touch on the iPhone 6S.

This year, there seems to be nothing new. Sure, the iPhone XS Max has a 6.5-inch OLED, the largest display ever on an iPhone, but that’s what people have been expecting since the iPhone X. In my opinion, the 6.1-inch edge-to-edge LCD on the iPhone XR is the more impressive of the two screen feats. But it’s also fair to say that it shouldn’t be spoken of in the same breath as Siri and Touch ID, which were truly ground-breaking tech in their day.

Does this mean Apple has lost its touch? Did the world’s first trillion-dollar company hit an innovation roadblock after its exertions last year, when it introduced the futuristic iPhone X with a Super Retina Display, TrueDepth camera system, and Face ID? I don’t think so.

When Apple CEO Tim Cook took the stage to announce the iPhone X in September 2017, he claimed that the phone would “set the path of technology for the next decade.” It’s a statement he has repeated in two earnings calls this year. The iPhone XS, XS Max, and XR, which take their design cues from the iPhone X, certainly look like a typical S-year upgrade. But delve deeper and you’ll realize that one improvement isn’t like the others. I’m referring to the new A12 Bionic chip.

It’s a fact that Apple debuts a new processor for its latest iPhone every year. And the narrative for each new “A” series chip is this: apps run faster and games run smoother, with better visuals. The Apple-designed A12 Bionic SoC (system-on-chip) broadly follows this marketing template. That’s unfortunate, because this means most consumers will overlook what Apple’s silicon team has achieved and what it means for the future of the iPhone.

For a start, the A12 Bionic is the industry’s first mobile processor manufactured on the state-of-the-art 7nm process to ship in volume. A smaller process node allows Apple to pack more transistors onto a chip. In the A12’s case, we’re looking at 6.9 billion transistors, way more than the 4.3 billion on the A11 from a year ago. Where do these nearly 7 billion transistors go?

Like last year’s A11, the A12 Bionic has a six-core CPU made up of two performance cores and four efficiency cores. But the performance cores are now 15% faster and the efficiency cores now use 50% less power. All six CPU cores can also be turned on at the same time when you need a power boost.

The GPU is improved, too. Now a four-core design (previously three cores), it is, Apple claims, up to 50% faster than the A11’s. All this is well and good, but it’s hard to imagine that you need 60% more transistors to pull that off.
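As a quick sanity check on the figures quoted so far, the jump from 4.3 billion to 6.9 billion transistors does indeed work out to roughly 60% (the numbers are from the article; the snippet below is just arithmetic):

```python
# Arithmetic check on the transistor counts quoted above: the A12's
# 6.9 billion versus the A11's 4.3 billion is about a 60% increase,
# matching the "60% more transistors" figure in the text.
a11_transistors = 4.3e9
a12_transistors = 6.9e9
increase = (a12_transistors - a11_transistors) / a11_transistors
print(f"{increase:.0%}")  # → 60%
```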

Completing the processor trinity is what Apple calls the “neural engine.” This is the component that I posit benefits the most from the increase in transistor count. It’s an eight-core part designed to handle machine learning and A.I. tasks, like Face ID, Animoji/Memoji, Siri voice recognition, and smart search in Photos.

The A11 has a neural engine too, but it pales in comparison to this “next-generation” version. According to Apple, the improved neural engine of the A12 Bionic can perform 5 trillion operations per second, 8.3 times the A11’s 600 billion operations. Core ML, Apple’s machine learning framework, now runs 9x faster, which means more immersive augmented reality experiences. With rumors of Apple incorporating time-of-flight (ToF) sensors in the 2019 iPhones and exploring AR glasses, it’s not hard to imagine that the company is paving the road now.

But AR and machine learning on phones are still in their infancy. If you need a compelling story today, know that the neural engine can also make existing features better. One prime example is Smart HDR, the marquee camera feature on the new iPhones.

According to Apple, Smart HDR is the result of hooking the neural engine into the phones’ ISP (image signal processor). The phone takes a four-photo buffer as well as several “interframes,” each focused on capturing different data (e.g., varying exposures to get more highlight and shadow detail). Instead of simply combining the photos, the neural engine analyzes each image and picks out the best parts of each to create the final photo.
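Apple hasn’t published the algorithm, but the core idea of picking the best-exposed data from each frame, rather than blending everything uniformly, can be shown with a toy sketch. The `fuse` function and its pick-the-nearest-midtone rule below are my own illustrative assumptions, not Apple’s implementation:

```python
# Toy sketch of Smart HDR-style frame fusion (an illustration of the
# idea, not Apple's actual pipeline): given frames shot at different
# exposures, each output pixel is taken from whichever frame exposed
# that pixel best, instead of averaging all frames together.

def fuse(frames, target=0.5):
    """frames: equal-length lists of pixel values in [0, 1].
    Per pixel, keep the value closest to a well-exposed midtone."""
    return [
        min(pixels, key=lambda v: abs(v - target))
        for pixels in zip(*frames)
    ]

underexposed = [0.05, 0.10, 0.60]   # shadows crushed, last pixel usable
overexposed  = [0.55, 0.60, 0.98]   # highlight blown, shadows lifted
print(fuse([underexposed, overexposed]))  # → [0.55, 0.6, 0.6]
```

Note how the fused result draws its shadow pixels from the brighter frame and its highlight pixel from the darker one, which is the selective, per-region behavior the article describes.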

In Portrait mode, the neural engine applies machine learning to analyze data from the camera sensor to distinguish faces. It also creates segmentation data to accurately separate the subject from the background and generate convincing bokeh. It’s thanks to the A12 Bionic chip and its neural engine that Apple is able to bring the advanced HDR and depth-of-field effects to the iPhone XR, which only has a single-lens rear camera.
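The segmentation-then-blur idea can be sketched in miniature. Everything here is hypothetical (a 1-D “image,” a mask written by hand instead of produced by a learned network), but it shows how a segmentation mask lets a single-camera phone keep the subject sharp while softening the background:

```python
# Toy sketch of mask-driven bokeh: on the real phone the segmentation
# mask comes from a neural network; here it is supplied by hand.
# Subject pixels (mask = 1) are kept sharp; background pixels are
# replaced by a local average, a crude stand-in for lens blur.

def apply_bokeh(row, mask, radius=1):
    """row: 1-D list of pixel values; mask: 1 = subject, 0 = background."""
    out = []
    for i, (value, is_subject) in enumerate(zip(row, mask)):
        if is_subject:
            out.append(value)            # subject: untouched
        else:
            lo = max(0, i - radius)      # background: box-blur window
            hi = min(len(row), i + radius + 1)
            window = row[lo:hi]
            out.append(sum(window) / len(window))
    return out

row  = [10, 10, 200, 200, 10, 10]
mask = [0, 0, 1, 1, 0, 0]   # "segmentation" says the middle is the subject
print(apply_bokeh(row, mask))
```

The subject pixels come through unchanged while the background pixels near the subject’s edge get smeared, which is why accurate segmentation data matters so much for convincing bokeh.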

A.I. is a term brandished by many smartphone makers these days, but few can make the connection between the tech and the user benefits like Apple does with the 2018 iPhones. There’s no doubt in my mind this neural engine will be central to Apple’s machine learning efforts, and the enabler of the coolest features we’ll see on iPhones in the next decade. This is only the beginning.


