Apple said to target rear-facing 3-D sensor for 2019 iPhone

The Mercury News - Business + Technology - By Alex Webb and Yuji Nakamura

Apple is working on a rear-facing 3-D sensor system for the iPhone in 2019, another step toward turning the handset into a leading augmented-reality device, according to people familiar with the plan.

Apple is evaluating a different technology from the one it currently uses in the TrueDepth sensor system on the front of the iPhone X, the people said. The existing system relies on a structured-light technique that projects a pattern of 30,000 laser dots onto a user's face and measures the distortion to generate an accurate 3-D image for authentication. The planned rear-facing sensor would instead use a time-of-flight approach that calculates the time it takes for a laser to bounce off surrounding objects to create a three-dimensional picture of the environment.
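The time-of-flight arithmetic is simple to state: depth is half the round-trip travel time of a light pulse multiplied by the speed of light. The following Swift lines are a minimal sketch of that principle; the function name and the example figure are illustrative, not drawn from Apple or any supplier's actual interface.

// Speed of light in meters per second.
let speedOfLight = 299_792_458.0

// Depth is half the round-trip distance the pulse travels.
func depth(fromRoundTripSeconds t: Double) -> Double {
    return speedOfLight * t / 2.0
}

// A pulse returning after roughly 6.67 nanoseconds implies an object about 1 meter away.
print(depth(fromRoundTripSeconds: 6.67e-9)) // ≈ 1.0

Because light covers a meter in about 3.3 nanoseconds each way, the sensor must resolve timing differences on the order of nanoseconds or finer, which is why much of the system's complexity sits in the image sensor itself.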

The company is expected to keep the TrueDepth system, so future iPhones will have both front- and rear-facing 3-D sensing capabilities. Apple has started discussions with prospective suppliers of the new system, the people said. Companies manufacturing time-of-flight sensors include Infineon Technologies, Sony Corp., STMicroelectronics and Panasonic. Testing of the technology is still in its early stages, and it could end up not being used in the final version of the phone, the people said. They asked not to be identified discussing unreleased features. An Apple spokeswoman declined to comment.

The addition of a rear-facing sensor would enable more augmented-reality applications on the iPhone. Apple Chief Executive Officer Tim Cook considers AR potentially as revolutionary as the smartphone itself. He has talked up the technology on Good Morning America and gives it almost as much attention during earnings calls as sales growth.

"We're already seeing things that will transform the way you work, play, connect and learn," he said on the most recent call. "AR is going to change the way we use technology forever."

Apple added a software tool called ARKit this year that made it easier for developers to make apps for the iPhone using AR. The tool is good at identifying flat surfaces and placing virtual objects or images on them. But it struggles with vertical planes, such as walls, doors or windows, and lacks accurate depth perception, which makes it harder for digital images to interact with real things. So if a digital tiger walks behind a real chair, the chair is still displayed behind the animal, destroying the illusion. A rear-facing 3-D sensor would help remedy that.
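To see why per-pixel depth fixes that occlusion problem, consider how a renderer decides which pixel wins. The Swift sketch below assumes flat pixel arrays and illustrative names throughout; it is not ARKit's API, just the underlying depth-test idea.

// Composite a virtual object over the camera feed using per-pixel depth.
// nil entries in virtualColor mark pixels with no virtual content.
func composite(cameraFeed: [UInt32], realDepth: [Double],
               virtualColor: [UInt32?], virtualDepth: [Double]) -> [UInt32] {
    var output = cameraFeed
    for i in output.indices {
        // Draw a virtual pixel only where the virtual surface is nearer to
        // the camera than the real one; otherwise the real object (the chair)
        // correctly hides the virtual one (the tiger).
        if let color = virtualColor[i], virtualDepth[i] < realDepth[i] {
            output[i] = color
        }
    }
    return output
}

Without a rear-facing depth sensor, realDepth is unknown, so the renderer has no choice but to paint every virtual pixel on top of the camera feed, which is exactly the broken-illusion behavior described above.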

The iPhone X uses its front-facing 3-D sensor for Face ID, a facial-recognition system that replaced the fingerprint sensor used in earlier models to unlock the handset. Production problems with the sensor array initially slowed manufacturing of the flagship smartphone, partly because the components must be assembled to a very high degree of accuracy.

While the structured-light approach requires lasers to be positioned very precisely, the time-of-flight technology relies on a more advanced image sensor. That may make time-of-flight systems easier to assemble in high volume.

Alphabet's Google has been working with Infineon on depth perception in its AR development push, Project Tango, unveiled in 2014. The Infineon chip is used in Lenovo Group's Phab 2 Pro and Asustek Computer's ZenFone AR; both run Android.
