Mac Format

how it works

All about Face ID and the TrueDepth camera

When Apple designed the iPhone X to be ‘all screen’, it created a couple of conundrums. There had to be somewhere to put the front-facing camera, so the notch was born. (Apple is one of several companies that have applied for patents on ways for cameras to shoot through or between pixels, so a truly all-screen phone should be possible one day.)

Meanwhile, the Home button had to be sacrificed, and with it Touch ID. Apple could simply have put the fingerprint sensor somewhere else, but instead it switched to a new method of biometric authentication using a sensor system, built into the notch, that it calls TrueDepth. It’s this hardware innovation that enables the iPhone X, XS, XS Max and XR to support Face ID.

Facial recognition is a staple of science fiction and heist movies, but making it work in real life is hard. Simply picking out a face in the image captured by an ordinary camera would mean anyone who waved a photo of you at your iPhone would be as likely to get into it as you are. Earlier phones, including Samsung’s Galaxy Note 8, could be fooled in this way.

Instead of just taking a picture, TrueDepth detects the three-dimensional shape of your face. The hardware, incorporated into the notch, is made for Apple by the French-Italian semiconductor company STMicroelectronics. On the right-hand side is a vertical-cavity surface-emitting laser (VCSEL). Just as an LED is a diode that emits light, a VCSEL is a diode that emits a laser beam.

The technology has been around for decades, but it’s new to the mass market, and the iPhone’s demand has vastly exceeded previous manufacturing capacity. In December 2017, Apple awarded $390m (about £300m) from its Advanced Manufacturing Fund to help optical component maker Finisar build a VCSEL factory in Texas, creating 500 jobs.

3D modelling

In front of the VCSEL is an optical filter, from Austrian maker Ams AG, that rapidly redirects its laser beam using tiny glass mirrors to project a grid of over 30,000 dots in a fraction of a second. The dots are invisible to the human eye, but a night-vision camera would see them spring into action when the iPhone’s proximity sensor – third from the left in the notch – tells it a face might be looming.

To the left again is another infrared source, a flood illuminator: essentially an infrared flash. With both this and the laser shining on your face, the TrueDepth infrared (IR) camera – the notch’s leftmost component – can capture both the dot pattern and a detailed image. Because it works in IR, it performs equally well in daylight or darkness.
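
Face ID’s own captures are handled by the system and never exposed, but apps can read the same hardware’s per-pixel depth through AVFoundation. Here’s a minimal Swift sketch, assuming an iOS app that already has camera permission; the DepthReader name is just for illustration.

```swift
import AVFoundation

// A minimal sketch: reading TrueDepth's per-pixel depth via AVFoundation.
// (Face ID's own captures are handled by the system and aren't exposed.)
final class DepthReader: NSObject, AVCaptureDepthDataOutputDelegate {
    let session = AVCaptureSession()
    let depthOutput = AVCaptureDepthDataOutput()

    func start() throws {
        // The front-facing TrueDepth camera; nil on devices without the notch.
        guard let camera = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                                   for: .video,
                                                   position: .front) else { return }
        session.beginConfiguration()
        session.addInput(try AVCaptureDeviceInput(device: camera))
        session.addOutput(depthOutput)
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
        session.commitConfiguration()
        session.startRunning()
    }

    // Called for each frame: the depth map is a buffer of camera-to-subject distances.
    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        let map = depthData.depthDataMap
        print("Depth map:", CVPixelBufferGetWidth(map), "x", CVPixelBufferGetHeight(map))
    }
}
```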

If the dots landed on a flat surface, they’d form a regular grid. On your irregularly shaped face, the divergence of their positions can be used to calculate a 3D model. This all happens behind the scenes with Face ID, but the same process is accessible to software through FaceMesh functions in ARKit, and apps such as MeasureKit (£4.99, see MF 320) can visualise it for you, showing how joining the dots reconstructs your head.
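
For developers, that mesh surfaces in ARKit as an ARFaceAnchor carrying an ARFaceGeometry of around 1,200 vertices. A minimal sketch of reading it, assuming a Face ID-capable device:

```swift
import ARKit

// A minimal sketch: receiving TrueDepth's fitted face mesh from ARKit.
final class FaceMeshReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking is only supported on devices with TrueDepth hardware.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    // ARKit refits the mesh to your face many times a second.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let face as ARFaceAnchor in anchors {
            let mesh = face.geometry // vertices joined into triangles, as MeasureKit shows
            print("Vertices:", mesh.vertices.count, "triangles:", mesh.triangleCount)
        }
    }
}
```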

It’s this 3D model, not how your face looks, that enables Face ID to recognise you. The optical image that’s captured simultaneously adds detail that’s used by iOS’s Attention Aware features, which recognise that your eyes are pointing at the device and, for example, avoid auto-dimming the screen while you’re looking at it. By default, Face ID requires attention, so you have to make eye contact when presenting your face. This is to help avoid accidental unlocks and prevent someone grabbing your phone while you’re incapacitated and unlocking it with your face. (The FBI, however, has already used a search warrant to demand that a suspect look at his iPhone X to unlock it.)
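
Third-party apps never see the model either: they ask the system to run the same check through the LocalAuthentication framework and get back only a pass or fail, with the enrolled face data kept in the Secure Enclave. A minimal sketch (the reason string is a placeholder):

```swift
import LocalAuthentication

// A minimal sketch: triggering the system's Face ID check from an app.
func unlockWithFaceID(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?
    // Confirm biometrics are available and that the hardware is Face ID.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error),
          context.biometryType == .faceID else {
        completion(false)
        return
    }
    // The system shows its own UI and compares against the Secure Enclave's model.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Unlock your notes") { success, _ in
        completion(success)
    }
}
```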

TrueDepth has much wider applications than Face ID. It’s also the basis of Animoji and Memoji, animated characters that track the movement of your face, eyes and tongue. One day, incorporating a system similar to TrueDepth into a device’s rear camera array could greatly enhance iOS’s augmented reality capabilities, which currently rely on the regular camera image and orientation sensing.
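
That expression tracking reaches developers as ARKit ‘blend shapes’: each facial movement arrives as a coefficient from 0 (at rest) to 1 (fully expressed). A minimal sketch reading a few of them from a face anchor like the ones delivered in the session sketch above:

```swift
import ARKit

// A minimal sketch: the per-expression coefficients that drive Animoji-style
// characters, read from an ARFaceAnchor (0 = at rest, 1 = fully expressed).
func readExpressions(from face: ARFaceAnchor) {
    let shapes = face.blendShapes
    let jawOpen   = shapes[.jawOpen]?.floatValue ?? 0
    let blinkLeft = shapes[.eyeBlinkLeft]?.floatValue ?? 0
    let tongueOut = shapes[.tongueOut]?.floatValue ?? 0 // tongue tracking arrived in iOS 12
    print("jaw:", jawOpen, "left blink:", blinkLeft, "tongue:", tongueOut)
}
```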

The next update to the iPad is expected to add TrueDepth to the front. As for Macs, recent Apple patents describe both facial recognition (with automatic login) and using a depth sensor to allow macOS to be controlled by gestures in the air.

Five years ago, Apple acquired PrimeSense, which was behind the tech in Microsoft Kinect, but we have yet to see Apple pursue this kind of interaction on the desktop.

Visualisations of Face ID’s infrared flood image and dot mesh pattern.

The MeasureKit app visualises TrueDepth’s modelling via FaceMesh.
