Robots still lack a critical element that will keep them from eclipsing most human capabilities anytime soon: a well-developed sense of touch

The Financial Express - FRONT PAGE - John Markoff

IN FACTORIES and warehouses, robots routinely outdo humans in strength and precision. Artificial intelligence software can drive cars, beat grandmasters at chess and leave ‘Jeopardy!’ champions in the dust.

But machines still lack a critical element that will keep them from eclipsing most human capabilities anytime soon: a well-developed sense of touch.

Consider Nikolas Blevins, a head and neck surgeon at Stanford Health Care, who routinely performs ear operations requiring that he shave away bone deftly enough to leave an inner surface as thin as the membrane in an eggshell.

Blevins is collaborating with the roboticists J Kenneth Salisbury and Sonny Chan on designing software that will make it possible to rehearse these operations before performing them. The programme blends X-ray and magnetic resonance imaging data to create a vivid three-dimensional model of the inner ear, allowing the surgeon to practise drilling away bone, to take a visual tour of the patient’s skull and to virtually ‘feel’ subtle differences in cartilage, bone and soft tissue. Yet no matter how thorough or refined, the software provides only the roughest approximation of Blevins’s sensitive touch.

“Being able to do virtual surgery, you really need to have haptics,” he said, referring to the technology that makes it possible to mimic the sensations of touch in a computer simulation.

The software’s limitations typify those of robotics, in which researchers lag in designing machines to perform tasks that humans routinely do instinctively. Since the first robotic arm was designed at the Stanford Artificial Intelligence Laboratory in the 1960s, robots have learned to perform repetitive factory work, but they can barely open a door, pick themselves up if they fall, pull a coin out of a pocket or twirl a pencil.

The correlation between highly evolved artificial intelligence and physical ineptness even has a name: Moravec’s paradox, after the robotics pioneer Hans Moravec. Advances in haptics and kinematics, the study of motion control in jointed bodies, are essential if robots are ever to collaborate with humans in hoped-for roles like food service worker, medical orderly, office secretary and health care assistant.

“It just takes time, and it’s more complicated,” Ken Goldberg, a roboticist at the University of California, Berkeley, said of such advances. “Humans are really good at this, and they have millions of years of evolution.”

Touch is a much more complicated sense than one might think. Humans have an array of organs that allow them to sense pressure, shear forces, temperature and vibrations with remarkable precision.

Research suggests that our sense of touch is actually several orders of magnitude finer than previously believed. Physiologists have shown that the interaction between a finger and a surface is detected by organs called mechanoreceptors, which are embedded at different depths in the skin. Some are sensitive to changes in an object’s size or shape and others to vibrations.

In the case of tiny surface variations, cues come from Pacinian corpuscles, oval-shaped structures about a millimetre long that signal when they are deformed.

Replicating that sensitivity is the goal of haptics, a science that is playing an increasing role in connecting the computing world to humans. One of the most significant advances in haptics has been made by Mako Surgical, founded in 2004 by the roboticist Rony Abovitz. In 2006, Mako began offering a robot that provides precise feedback to surgeons repairing arthritic knee joints.

“I thought haptics was a way to combine machine intelligence and human intelligence in a way that the machine would do what it was good at and the human would do what the human was good at, and there was this really interesting symbiosis that could come about,” Abovitz said, adding: “The surgeon still has the sense of control and can put the energy into the motion and push. But all of the intelligent guidance and what you thought the surgeon would normally do is done by the machine.”

Even in industries where robots are entrenched, experts worry about the dangers they pose to the people who work alongside them. Robots have caused dozens of workplace deaths and injuries in the US; if a robot revolution is ever to take place, scientists will have to create machines that meet exacting safety standards, and do it inexpensively.

“For the last 30 years, industrial robots have focused on one metric: being fast and cheap,” said Kent Massey, the director of advanced programmes at HDT Global, a robotics firm based in Solon, Ohio. “It has been about speed. It’s been awesome, but a standard arm today is precise and stiff and heavy, and they’re really dangerous.”

Massey’s company is one of a number of robot-arm designers that are beginning to build safer machines. Rethink Robotics in Boston and Universal Robots in Denmark have built ‘compliant’ robots that sense human contact. The Universal system uses a combination of sensors in its joints and software, while the Rethink robot uses ‘series elastic actuators’, essentially springs in the joints that mimic the compliance of human muscles and tendons, along with acoustic sensors so the robot can slow when humans approach.
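The idea behind a series elastic actuator is simple: a spring sits between the motor and the joint, and by measuring how far that spring deflects, the robot can estimate the force it is transmitting and back off on unexpected contact. A minimal sketch of that principle (the function names, stiffness and threshold here are illustrative, not Rethink’s actual design):

```python
def sea_force(spring_stiffness_n_per_m, motor_pos_m, joint_pos_m):
    """Estimate the force transmitted through a series elastic actuator
    from the spring's deflection (Hooke's law): the spring deforms when
    the motor-side and joint-side positions differ."""
    return spring_stiffness_n_per_m * (motor_pos_m - joint_pos_m)

def safe_to_continue(force_n, contact_threshold_n=15.0):
    """Flag unexpected contact: an unusually large transmitted force
    suggests the arm has run into something (or someone)."""
    return abs(force_n) < contact_threshold_n

# A 5 mm deflection on a 5000 N/m spring transmits about 25 N,
# above the threshold, so the arm should stop or slow down.
force = sea_force(5000.0, 0.105, 0.100)
print(force, safe_to_continue(force))
```

Because the spring itself is the force sensor, the joint stays compliant even if the software fails, which is part of what makes such arms safer than the stiff industrial arms Massey describes.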

Beyond advances necessary for basic safety, scientists are focusing on more subtle aspects of touch. Last year, researchers at Georgia Tech reported in the journal Science that they had fabricated bundles of tiny transistors called taxels to measure changes in electrical charges that signal mechanical strain or pressure. The goal is to design touch-sensitive applications, including artificial skin for robots and other devices.

Much research is focusing on vision and its role in touch. The newest da Vinci Xi, a surgery system developed by Intuitive Surgical, uses high-resolution 3D cameras to enable doctors to perform delicate operations remotely, manipulating tiny surgical instruments. The company focused on giving surgeons better vision, because the necessary touch for operating on soft tissue like organs is still beyond the capability of haptics technology.

Curt Salisbury, a principal research engineer at SRI International, a non-profit research institute, said that while surgeons could rely on visual cues provided by soft tissues to understand the forces exerted by their tools, there were times when vision alone would not suffice. “Haptic feedback is critical when you don’t have good visual access,” he said.

Other researchers believe that advances in sensors that more accurately model human skin, as well as algorithms that fuse vision, haptics and kinematics, will lead to vast improvements in the next generation of robots.

One path is being pursued by Eduardo Torres-Jara, an assistant professor of robotics at Worcester Polytechnic Institute in Massachusetts, who has defined an alternative theory he describes as ‘sensitive robotics’. He has created a model of robotic motion, grasping and manipulation that begins with simply knowing where the robot’s feet or hands meet the ground or an object. Using biologically inspired artificial skin that can detect tiny changes in magnetic forces, he has built a two-legged walking robot that is able to balance and stride by measuring changing forces on the bottoms of its feet.

If improving tactile performance depends on greater computing power, help may be on the way. Goldberg, the Berkeley roboticist, has begun designing cloud-based robotic systems that can tap vast pools of computing power via the Internet.

In July, roboticists at Brown, Cornell, Stanford and Berkeley described a database called Robo Brain, sponsored by the National Science Foundation, that is intended to offer an Internet-based repository of images and videos to give robots support for performing actions in the physical world.

Other haptics researchers believe that artificially replicating touch will have a powerful effect on the development of autonomous robots, as well as systems that augment humans.

Last fall, Allison Okamura, an associate professor of mechanical engineering at the Laboratory for Collaborative Haptics and Robotics in Medicine at Stanford, taught an online course in haptics. Students assembled ‘hapkits’ designed by the Stanford education professor Paulo Blikstein, then programmed them to create virtual devices like springs and dampers that could be manipulated as if they were in the real world.
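A virtual spring and damper of the kind the students programmed rests on a simple control law: each time the device’s handle position and velocity are read, the motor is commanded to push back with a spring force proportional to displacement plus a damping force proportional to velocity. A minimal sketch of that law (the constants and names are illustrative, not the hapkit’s actual firmware):

```python
def virtual_spring_damper(position_m, velocity_m_s, k=200.0, b=1.5):
    """Force (in newtons) the haptic motor should exert to simulate a
    spring of stiffness k (N/m) in parallel with a damper of
    coefficient b (N*s/m). The minus signs make both terms resist
    the user's motion."""
    return -k * position_m - b * velocity_m_s

# At the spring's rest position, the user feels nothing:
print(virtual_spring_damper(0.0, 0.0))
# Pressed 1 cm in while still moving inward at 0.1 m/s, the spring
# and damper together push back with about 2.15 N:
print(virtual_spring_damper(0.01, 0.1))
</imports>
</imports>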

The students followed with new projects, tweaking the hardware and sharing programmes they had created. Okamura said their enthusiasm was understandable.

“If you have all these senses—vision, hearing, taste, touch and smell—and someone took them away from you one by one, which is the last one you would give up?” she asked. “Almost everyone says vision, but for me, it would be touch.”
