Digital sense: IBM predicts that in five years computers will have all the senses humans have. We study how devices are acquiring sight, smell and touch.

WE ARE REALLY CLOSE TO THE DAY WHEN COMPUTERS AND PHONES WILL HAVE ONE OR ALL OF THE SENSES HUMANS ARE SO PROUD OF

Gadgets and Gizmos (India) - BY NANDAGOPAL RAJAN

Recently, IBM released a list of innovations that have the potential to change the way people work, live and interact during the next five years. The list, based on market and societal trends as well as emerging technologies from IBM's R&D labs, says touch, sight, hearing, taste and smell will be the next big things in computing.

“We have already witnessed the benefits of cognitive systems for advancing numerous aspects of the human experience - from agriculture and healthcare to utility management and weather forecasting. We envision a day when computers make sense of the world around them just as the human brain relies on interacting with the world using multiple senses,” said Ramesh Gopinath, Director - India Research Lab and Chief Technology Officer, IBM India/South Asia.

We take a look at what IBM and others are doing to bring the senses to your computer or phone. Interestingly, almost all of these technologies are at advanced stages of testing.

So while IBM says they will be available to consumers by 2017, don't be surprised if some of these technologies take a shortcut and turn up in a device that goes on sale in a couple of years.

Here is how the senses are finding their way into our digital lives:

SIGHT

By the end of this decade, computers will not only be able to look at and recognise the contents of images and visual data, they will also start making sense of the pixels much as a human views and interprets a photograph. So future computers will know that a red light means stop, and will be able to interpret signage on a road. A precursor to this can be seen in the Google Goggles app, which recognises products from photographs and gives you information about them. But IBM says that in five years these capabilities will be put to work in healthcare, making sense of massive volumes of medical information. For instance, computers will be able to differentiate healthy from diseased tissue. Another use could be cameras acting as body scanners to tell which outfit will be a perfect fit for a person. The apparel industry is already experimenting with this technology, with an eye on how it could bring in more online buyers.
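To give a rough sense of what such recognition already looks like in practice, here is a minimal Python sketch that asks an off-the-shelf image classifier to label a photograph. The model (ResNet-18 from torchvision) and the file name are assumptions chosen for illustration; they are not the systems IBM describes.

```python
# Minimal sketch: label a photo with a pretrained classifier.
# Assumes torchvision >= 0.13 is installed and "street_scene.jpg" exists;
# both are stand-ins for illustration only.
import torch
from torchvision import models
from PIL import Image

weights = models.ResNet18_Weights.DEFAULT      # pretrained ImageNet weights
model = models.resnet18(weights=weights)
model.eval()

preprocess = weights.transforms()              # resize, crop, normalise
img = Image.open("street_scene.jpg").convert("RGB")
batch = preprocess(img).unsqueeze(0)           # shape: (1, 3, 224, 224)

with torch.no_grad():
    probs = torch.softmax(model(batch), dim=1)[0]

top5 = torch.topk(probs, 5)
for p, idx in zip(top5.values.tolist(), top5.indices.tolist()):
    print(f"{weights.meta['categories'][idx]}: {p:.2%}")
```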

TOUCH

Scientists have for decades been trying to bring touch and feel to mechanical devices. Now, this dream is becoming more of a reality. Soon we could have mobile devices that let you touch and feel products, redefining the retail business across the world. IBM says its scientists are developing applications for the retail, healthcare and other sectors using haptic, infrared and pressure-sensitive technologies to simulate touch, such as the texture and weave of a fabric. Imagine a shopper touching the screen to feel the texture of a fabric she wants to buy. This technology will use the vibration capabilities of the phone, assigning each article a unique set of vibration patterns that recreates its texture.
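A toy Python sketch of that idea: each fabric gets its own vibration pattern, which a phone's haptic motor would replay when the shopper touches the item on screen. The fabric names, timing values and the print stand-in for a haptics call are all invented for illustration.

```python
# Toy illustration: map each fabric to a pattern of (vibrate_ms, pause_ms)
# pairs that a phone's haptic motor could replay. All names and timings
# below are made up for the sake of the example.
import time

TEXTURE_PATTERNS = {
    "silk":     [(10, 40)] * 8,    # light, sparse pulses: smooth feel
    "denim":    [(35, 10)] * 12,   # dense, heavy pulses: coarse weave
    "corduroy": [(25, 25)] * 10,   # evenly spaced pulses: regular ridges
}

def play_pattern(fabric: str) -> None:
    """Replay the fabric's pattern; on a phone this would drive the motor."""
    for on_ms, off_ms in TEXTURE_PATTERNS[fabric]:
        print(f"vibrate {on_ms} ms")          # stand-in for a haptics API call
        time.sleep((on_ms + off_ms) / 1000)   # wait out the pulse and the gap

play_pattern("denim")
```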

SMELL

In the next five years, tiny sensors embedded in your computer or cell phone will detect if you're coming down with a cold or other illness. By analysing odours, biomarkers and thousands of molecules in someone's breath, doctors will have help diagnosing and monitoring the onset of ailments such as liver and kidney disorders, asthma, diabetes and epilepsy by detecting which odours are normal and which are not. IBM scientists are already sensing environmental conditions and gases to preserve works of art. Companies like DigiScents and TriSenx are developing devices that will be able to recreate smells when connected to a computer. Imagine yourself holding your nose as you watch a video of a fish market. DigiScents, for instance, has indexed thousands of smells based on their chemical structure and their place on the scent spectrum, before coding and digitising them into a small file that can be embedded in web content. Meanwhile, IBM technology will "smell" surfaces for disinfectants to determine whether rooms have been sanitised. Using novel wireless "mesh" networks, sensors will gather and measure data on various chemicals, continuously learning and adapting to new smells over time.
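To picture what a "digitised smell" embedded in web content might look like, here is a small, entirely invented scent descriptor in Python. DigiScents' real encoding is not described in this article, so every field and value below is an assumption.

```python
# Invented example of a compact "scent file": a smell expressed as
# proportions of base odorant components, small enough to embed in a page.
# Component names, values and field names are illustrative, not a real format.
import json

scent = {
    "name": "sea_breeze",
    "components": {        # fraction of each base odorant, summing to 1.0
        "marine": 0.55,
        "citrus": 0.25,
        "ozone":  0.20,
    },
    "intensity": 0.6,      # 0.0 (faint) .. 1.0 (strong)
    "duration_s": 4,       # how long an emitter would release the scent
}

payload = json.dumps(scent)
print(f"{len(payload)} bytes:", payload)   # only a couple of hundred bytes
```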

TASTE

IBM researchers are developing a computing system that experiences flavour. It works by breaking ingredients down to their molecular level and blending the chemistry of food compounds with the psychology behind the flavours and smells humans prefer. By comparing this with millions of recipes, the system will be able to create new flavour combinations that pair well. It will use algorithms to determine the precise chemical structure of food and why people like certain tastes. These algorithms will examine how chemicals interact with each other, the molecular complexity of flavour compounds and their bonding structure, and use that information, together with models of perception, to predict taste appeal. At the University of Tsukuba in Japan, researchers are working on a food simulator that can mimic the taste and "mouthfeel" of food.
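One common way to approximate the pairing idea IBM describes is to score ingredient pairs by the flavour compounds they share. The short Python sketch below does exactly that with a tiny, made-up compound table; real systems work from databases of thousands of compounds and millions of recipes.

```python
# Toy flavour-pairing sketch: rank ingredient pairs by shared compounds.
# The ingredient-to-compound table is invented for illustration.
from itertools import combinations

FLAVOUR_COMPOUNDS = {
    "strawberry": {"furaneol", "linalool", "hexanal"},
    "tomato":     {"hexanal", "linalool", "geranial"},
    "basil":      {"linalool", "eugenol", "geranial"},
    "chocolate":  {"furaneol", "vanillin", "pyrazine"},
}

def pairing_score(a: str, b: str) -> int:
    """Count the flavour compounds the two ingredients have in common."""
    return len(FLAVOUR_COMPOUNDS[a] & FLAVOUR_COMPOUNDS[b])

# Rank every possible pair, best matches first.
pairs = sorted(combinations(FLAVOUR_COMPOUNDS, 2),
               key=lambda p: pairing_score(*p), reverse=True)
for a, b in pairs:
    print(f"{a} + {b}: {pairing_score(a, b)} shared compounds")
```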

SOUND

When computers start hearing, a distributed system of clever sensors will detect elements of sound such as sound pressure, vibrations and sound waves at different frequencies, and interpret them to predict when trees will fall in a forest or when a landslide is imminent. Such a system could also gauge the mood of a speaker, or analyse whether he is lying. The systems will pinpoint aspects of a conversation and analyse pitch, tone and hesitancy to help us have more productive dialogues, which could improve customer call-centre interactions or allow us to interact seamlessly across different cultures. Scientists are now studying underwater noise levels to understand the impact of wave-energy conversion machines on sea life.
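At the lowest level, this kind of "hearing" starts with pulling frequencies out of a pressure signal. The Python sketch below estimates the dominant frequency of a short clip with a Fourier transform; the synthetic 220 Hz tone is a stand-in for a real microphone or underwater sensor recording.

```python
# Minimal sketch: find the dominant frequency in a one-second signal.
# The synthetic tone replaces real sensor data for illustration.
import numpy as np

SAMPLE_RATE = 16_000                        # samples per second
t = np.arange(SAMPLE_RATE) / SAMPLE_RATE    # one second of time stamps
signal = 0.8 * np.sin(2 * np.pi * 220 * t)  # pretend recording: a 220 Hz tone

spectrum = np.abs(np.fft.rfft(signal))                  # energy per frequency bin
freqs = np.fft.rfftfreq(len(signal), d=1 / SAMPLE_RATE) # bin centre frequencies

dominant = freqs[np.argmax(spectrum)]
print(f"dominant frequency: {dominant:.1f} Hz")         # ~220.0 Hz for this tone
```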
