Business World

Apple developing rear-facing 3-D sensor for 2019 iPhone


APPLE, INC. is working on a rear-facing 3-D sensor system for the iPhone in 2019, another step toward turning the handset into a leading augmented-reality (AR) device, according to people familiar with the plan.

Apple is evaluating a different technology from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X, the people said. The existing system relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user’s face and measures the distortion to generate an accurate 3-D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach that calculates the time it takes for a laser to bounce off surrounding objects to create a three-dimensional picture of the environment.
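The arithmetic behind the time-of-flight approach described above is simple: light travels out to an object and back, so halving the round-trip time gives the distance. A minimal illustrative sketch (not Apple's or any vendor's implementation):

```python
# Illustrative sketch of time-of-flight depth sensing: a laser pulse is
# emitted, the round trip to a surface is timed, and the delay is
# converted to a distance per pixel.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the object. The pulse travels out and back,
    so the one-way distance is half the round-trip path."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A round trip of roughly 6.67 nanoseconds corresponds to about 1 meter,
# which shows why these sensors need picosecond-scale timing precision.
print(tof_distance(6.67e-9))
```

Structured light, by contrast, infers depth from how a known dot pattern deforms on a surface; time-of-flight measures the delay directly, which is one reason it scales better to room-sized scenes.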

The company is expected to keep the TrueDepth system, so future iPhones will have both front- and rear-facing 3-D sensing capabilities. Apple has started discussions with prospective suppliers of the new system, the people said. Companies manufacturing time-of-flight sensors include Infineon Technologies AG, Sony Corp., STMicroelectronics NV and Panasonic Corp. The testing of the technology is still in early stages and it could end up not being used in the final version of the phone, the people said. They asked not to be identified discussing unreleased features. An Apple spokeswoman declined to comment.

The addition of a rear-facing sensor would enable more augmented-reality applications in the iPhone. Apple Chief Executive Officer Tim Cook considers AR potentially as revolutionary as the smartphone itself. He’s talked up the technology on Good Morning America and gives it almost as much attention during earnings calls as sales growth. “We’re already seeing things that will transform the way you work, play, connect and learn,” he said in the most recent call. “AR is going to change the way we use technology forever.”

Apple added a software tool called ARKit this year that made it easier for developers to make apps for the iPhone using AR. The tool is good at identifying flat surfaces and placing virtual objects or images on them. But it struggles with vertical planes, such as walls, doors or windows, and lacks accurate depth perception, which makes it harder for digital images to interact with real things. So if a digital tiger walks behind a real chair, the chair is still displayed behind the animal, destroying the illusion. A rear-facing 3-D sensor would help remedy that.
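The tiger-and-chair problem above is a per-pixel depth test: a virtual object should only be drawn where it is closer to the camera than the real scene. A hypothetical sketch of that logic (the function and its behavior without depth data are illustrative, not ARKit's actual renderer):

```python
# Hypothetical per-pixel occlusion test. With a rear depth sensor, the
# renderer knows how far away the real scene is at each pixel and can
# hide virtual content behind nearer real objects. Without it, the
# virtual object is simply drawn on top, breaking the illusion.

def visible_layer(virtual_depth_m, real_depth_m):
    """Return which layer wins at one pixel: 'virtual' or 'real'."""
    if real_depth_m is None:
        # No rear 3-D sensor: real-scene depth is unknown, so the
        # virtual object is always composited on top.
        return "virtual"
    return "virtual" if virtual_depth_m < real_depth_m else "real"

print(visible_layer(3.0, 1.5))   # tiger at 3 m behind a chair at 1.5 m
print(visible_layer(3.0, None))  # no depth data: tiger wrongly in front
```

With depth data, the chair (1.5 m) correctly occludes the tiger (3 m); without it, the virtual layer always wins.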

The iPhone X uses its front-facing 3-D sensor for Face ID, a facial-recognition system that replaced the fingerprint sensor used in earlier models to unlock the handset. Production problems with the sensor array initially slowed manufacturing of the flagship smartphone, partly because the components must be assembled to a very high degree of accuracy.

Alphabet, Inc.’s Google has been working with Infineon on depth perception as part of its AR development push, Project Tango, unveiled in 2014. The Infineon chip is already used in Lenovo Group Ltd.’s Phab 2 Pro and Asustek Computer, Inc.’s ZenFone AR, both of which run on Google’s Android operating system.

AN ATTENDEE uses a new iPhone X during a presentation for the media in Beijing, China on Oct. 31.
