What the heck?
What will depth sensing do next?
Adam Banks takes a close look at what depth sensing will do next.
Research into depth sensing (also known as range imaging) by Apple and others has three prongs: engineering new sensors that can measure the distance to points in a scene; leveraging multi-purpose camera technologies to approximate similar results; and developing software to best use the data.
The notch on the front of every 2018 iPhone contains one camera plus a TrueDepth sensor. The latter projects thousands of infrared dots onto the subject, which are then read back as a flat image, from which software reconstructs a 3D shape. This system lets Face ID recognize faces without being fooled by, say, a photo of the user, and helps to separate foreground from background so that Portrait mode can blur the latter.
No iPhone yet has a depth sensor on the back. Instead, images taken simultaneously by the twin optical rear cameras, which have different focal lengths, are compared in software to calculate depth. This is the basis of ARKit apps, which may add computer-generated elements to a live view or take measurements of objects. It's also used for Portrait mode, except on the iPhone XR, which has only a single rear camera. This instead relies on the Focus Pixels feature: each cell in the camera sensor has two photodiodes, creating two images offset by about a millimetre. That's barely enough for depth estimation, but it's enhanced by machine learning that targets faces.
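The idea behind comparing two offset views is classic stereo triangulation: the further away a point is, the less it appears to shift between the two images. A minimal sketch, with made-up example numbers (the function name and values here are illustrative, not Apple's implementation):

```python
# Hypothetical illustration of depth-from-disparity triangulation,
# the principle behind comparing images from two offset cameras.
# All numbers below are invented for the example.

def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo relation: depth = focal length x baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# A feature shifted 20 pixels between views, with a 2800-pixel focal
# length and a 10 mm camera baseline, sits about 1.4 m away.
print(depth_from_disparity(2800.0, 0.010, 20.0))  # 1.4
```

Note how depth is inversely proportional to disparity: with a tiny baseline, such as the split photodiodes of Focus Pixels, the shift between views becomes vanishingly small, which is why that method needs machine learning to help.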
The next step for depth sensing will come first to the back of the iPhone, leapfrogging TrueDepth and enabling more precise AR, as well as more sophisticated depth-of-field photo effects. It's called time of flight (ToF), and rather than just capturing an image of where projected laser dots land, it measures how long the beam takes to return, creating a 3D model within the sensor's field of view. Tech analyst Ming-Chi Kuo reckons ToF could arrive in the iPad in late 2019 or early 2020, heralding a new generation of practical AR measurement and simulation apps. Squeezing it into iPhones may take a little longer.
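The arithmetic behind time of flight is simple: light travels out to the subject and back, so the distance is half the round trip multiplied by the speed of light. A sketch of the conversion (the timings are illustrative; real ToF sensors typically infer them indirectly, for instance from the phase shift of a modulated beam):

```python
# Minimal sketch of the time-of-flight calculation:
# distance = (speed of light x round-trip time) / 2.
# The example timing is illustrative, not from any real sensor.

SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Convert a measured round-trip time into a distance in metres."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse returning after roughly 6.67 nanoseconds corresponds to
# a subject about one metre from the sensor.
print(round(tof_distance(6.671e-9), 3))
```

The scale this implies is the engineering challenge: resolving depth to the centimetre means timing light to tens of picoseconds, which is why ToF hardware is harder to miniaturize than a dot projector.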