Toronto Star

Sometimes when we touch: How a microchip can identify objects

Scottish researchers use Soli chip to ‘read’ radar signatures of things touching it

- PATTY WINSA FEATURE WRITER

Google created a new microchip to allow touchless interaction with smartphones. But researchers at the University of St. Andrews in Scotland stumbled upon an unexpected use: it can be used to identify objects that it touches.

Google developed the Soli chip so that devices could recognize fine finger gestures made at a distance from the screen. The chip sends out radar signals and processes the information that bounces back from the movement of your fingers in the air, so it knows if you are miming a tap on an icon or a swipe across your screen.

Inside the chip is an antenna device with four transmitting units and two receiving units. Each transmitting unit fires up and emits a signal in a particular part of the frequency range, from 57 to 60 GHz, then shuts off and the next one turns on, and so on. When the signal hits an object, part of the signal is scattered from the front of the object, other parts are reflected from the inside and still others reflected from the back of the object. Those return signals are captured by Soli.
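The round-robin transmit scheme described above can be sketched in code. This is a minimal illustration only: the article gives the band (57 to 60 GHz) and the take-turns firing order, but the exact sub-band allocation and timing inside Soli are not public, so the even four-way split below is an assumption.

```python
# Illustrative sketch of Soli-style sequential transmission: four
# transmitters take turns emitting in sub-bands of the 57-60 GHz range.
# The even sub-band split and the ordering are assumptions.

BAND_START_GHZ = 57.0
BAND_END_GHZ = 60.0
NUM_TRANSMITTERS = 4

def sub_bands(start=BAND_START_GHZ, end=BAND_END_GHZ, n=NUM_TRANSMITTERS):
    """Divide the band into n contiguous sub-bands, one per transmitter."""
    width = (end - start) / n
    return [(start + i * width, start + (i + 1) * width) for i in range(n)]

def transmit_schedule(cycles=1):
    """Yield (cycle, transmitter_id, (low_GHz, high_GHz)) in firing order:
    each transmitter fires, shuts off, then the next one turns on."""
    bands = sub_bands()
    for c in range(cycles):
        for tx_id, band in enumerate(bands):
            yield (c, tx_id, band)

for cycle, tx, (lo, hi) in transmit_schedule():
    print(f"cycle {cycle}: TX{tx} emits {lo:.2f}-{hi:.2f} GHz")
```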

Prof. Aaron Quigley, chair of human-computer interaction at St. Andrews, says the university was one of 40 institutions to receive a Soli alpha development kit from Google. The kit, about the size of a pack of playing cards, contains the chip and, around it, the circuitry and hardware to collect and process the data.

At first, his team of two student researchers used the kit to try to measure wrist movement in the air. But when one of them placed their wrist right on the kit, they realized that if they sampled the return radar signals hundreds of times, the pattern was unique enough to identify a wrist, as well as inanimate objects.
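Why hundreds of samples? Averaging many noisy returns makes the underlying pattern stand out. The toy sketch below illustrates that idea only; the three-number "signature" and the noise model are invented, and real Soli returns are far richer.

```python
# Toy illustration: averaging hundreds of noisy radar returns yields a
# far more stable signature than any single return. The signature vector
# and Gaussian noise model are invented for illustration.
import random

def noisy_sample(true_signature, rng, noise=0.5):
    """One simulated return: the object's true signature plus noise."""
    return [v + rng.gauss(0, noise) for v in true_signature]

def averaged_signature(true_signature, rng, n_samples=500):
    """Average hundreds of returns so per-sample noise cancels out."""
    total = [0.0] * len(true_signature)
    for _ in range(n_samples):
        for i, v in enumerate(noisy_sample(true_signature, rng)):
            total[i] += v
    return [t / n_samples for t in total]

rng = random.Random(0)
truth = [1.0, 3.0, 2.0]
one = noisy_sample(truth, rng)
avg = averaged_signature(truth, rng)
err_one = sum(abs(a - b) for a, b in zip(one, truth))
err_avg = sum(abs(a - b) for a, b in zip(avg, truth))
print(f"single-sample error {err_one:.3f}, averaged error {err_avg:.3f}")
```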

The unique radar patterns, which capture an object's surface characteristics, its composition and even its rear surface, were fed into a software program the team created called RadarCat, short for radar categorization of objects. The result was a dictionary of objects. This is called supervised machine learning: the software program can't figure an object out on its own the first time, it has to be trained on labelled examples.

But once RadarCat does learn an object, the researchers say, testing showed the program will identify it again, not only inanimate objects but living ones as well, such as your wrist, although the program is less accurate when it comes to body parts.

With more development, though, it could mean that one day your phone will know where it is on your body and change its interface accordingly, creating a louder ring tone because it's in your pocket or larger icons because it knows you're tapping with a gloved hand.

“What we’ve done is show the research is possible,” Quigley says.

Google’s chip sends out radar signals and processes the information that bounces back from the movement of your fingers. (Source: Google / Toronto Star graphic)
The Soli chip has four transmitting units and two receiving units. (Google / University of St. Andrews)
