Apple evaluates augmented reality for future iPhones

Irish Examiner - Business - Alex Webb and Yuji Nakamura

Apple is working on a rear-facing 3D sensor system for the iPhone in 2019, another step toward turning the handset into a leading augmented-reality device, according to sources.

Apple is evaluating a different technology from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X. The existing system relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user’s face and measures the distortion to generate an accurate 3D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach that calculates the time it takes for a laser to bounce off surrounding objects to create a three-dimensional picture of the environment.
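
The time-of-flight principle itself is straightforward: the distance to an object is half the round-trip time of a light pulse multiplied by the speed of light. The sketch below illustrates that relation with a hypothetical measurement; it is only an illustration of the principle, not a description of Apple’s sensor.

```swift
import Foundation

// Illustrative time-of-flight calculation with a made-up measurement;
// real sensors apply the same relation at much finer time resolution.
let speedOfLight = 299_792_458.0        // metres per second
let roundTripTime = 13.3e-9             // seconds: a hypothetical ~13.3 ns echo
let distance = speedOfLight * roundTripTime / 2.0
print(String(format: "Estimated distance: %.2f m", distance))  // about 1.99 m
```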

The company is expected to keep the TrueDepth system, so future iPhones will have both front and rear-facing 3D sensing capabilities.

Apple has started discussions with prospective suppliers of the new system, the sources said.

Companies manufacturing time-of-flight sensors include Infineon Technologies, Sony, STMicroelectronics, and Panasonic.

Testing of the technology is still in its early stages, and it may not be used in the final version of the phone. An Apple spokeswoman declined to comment.

The addition of a rear-facing sensor would enable more augmented-reality applications on the iPhone.

Apple chief executive Tim Cook considers AR potentially as revolutionary as the smartphone itself. He has talked up the technology on American television and gives it almost as much attention during earnings calls as sales growth.

“We’re already seeing things that will transform the way you work, play, connect and learn,” he said. “AR is going to change the way we use technology forever.”

Apple added a software tool called ARKit this year that made it easier for developers to build AR apps for the iPhone.

The tool is good at identifying flat surfaces and placing virtual objects or images on them. But it struggles with vertical planes, such as walls, doors or windows, and lacks accurate depth perception, which makes it harder for digital images to interact with real things.
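
As a rough illustration of what ARKit already handles well, the sketch below configures horizontal-plane detection and drops a virtual object onto a detected surface. The sceneView, delegate class and cube geometry are assumptions made for the example, not part of any Apple sample code.

```swift
import ARKit

// A minimal sketch of ARKit horizontal-plane detection. It assumes an ARSCNView
// named sceneView is already in the app's view hierarchy.
let configuration = ARWorldTrackingConfiguration()
configuration.planeDetection = .horizontal   // flat surfaces such as floors and tables
sceneView.session.run(configuration)

// Detected surfaces are reported to the scene view's delegate as ARPlaneAnchor
// objects, and virtual content can then be attached to the corresponding node.
class PlaneDelegate: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        // Drop a small virtual cube onto the centre of the detected plane.
        let cube = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
        cube.simdPosition = plane.center
        node.addChildNode(cube)
    }
}
```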

So if a digital tiger walks behind a real chair, the chair is still displayed behind the animal, destroying the illusion. A rear-facing 3D sensor would help remedy that.

Production problems with the TrueDepth sensor array initially slowed manufacturing of the iPhone X, partly because the components must be assembled to a very high degree of accuracy.

Bloomberg

Picture: Stefan Rousseau/PA Wire

A man uses the facial recognition feature on an iPhone X in the Apple Store on Regent St, London, after the new handset went on sale this month. Apple is working on a rear-facing 3D sensor for the iPhone in 2019.
