Sometimes when we touch: How a microchip can identify objects
Scottish researchers use Soli chip to ‘read’ radar signatures of things touching it
Google created a new microchip to allow touchless interaction with smartphones. But researchers at the University of St. Andrews in Scotland stumbled upon an unexpected use: it can also identify objects that touch it.
Google developed the Soli chip so that devices could recognize fine finger gestures made at a distance from the screen. The chip sends out radar signals and processes the information that bounces back from the movement of your fingers in the air, so it knows if you are miming a tap on an icon or a swipe across your screen.
Inside the chip is an antenna device with four transmitting units and two receiving units. Each transmitting unit fires up and emits a signal in a particular part of the frequency range, from 57 to 60 GHz, then shuts off and the next one turns on, and so on. When the signal hits an object, part of the signal is scattered from the front of the object, other parts are reflected from the inside and still others reflected from the back of the object. Those return signals are captured by Soli.
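The transmit cycle described above can be sketched in code. This is a simplified model for illustration only: the unit counts and the 57–60 GHz band come from the article, but the sub-band split, function names and toy echo values are assumptions, not Soli's actual firmware interface.

```python
# Illustrative sketch: four transmit units take turns emitting in
# sub-bands of the 57-60 GHz range; two receive units capture echoes.
# All names and numeric values here are hypothetical.

NUM_TX = 4               # transmitting units on the Soli antenna
NUM_RX = 2               # receiving units
BAND_GHZ = (57.0, 60.0)  # frequency range the article describes

def sub_bands(band, n):
    """Split the frequency band into n equal sub-bands, one per TX unit."""
    lo, hi = band
    step = (hi - lo) / n
    return [(lo + i * step, lo + (i + 1) * step) for i in range(n)]

def sweep_once(echo_model):
    """One cycle: each TX unit fires in turn; both RX units record echoes."""
    frame = []
    for tx, (f_lo, f_hi) in enumerate(sub_bands(BAND_GHZ, NUM_TX)):
        centre = (f_lo + f_hi) / 2
        # Each receiver sees energy scattered from the front, the
        # interior and the back of the object, summed into one sample.
        echoes = [echo_model(tx, rx, centre) for rx in range(NUM_RX)]
        frame.append(echoes)
    return frame  # NUM_TX rows of NUM_RX samples per sweep

# Toy echo model: return strength depends on the unit pair and frequency.
toy = lambda tx, rx, f: round(1.0 / (1 + tx + rx) * (f / 60.0), 3)
frame = sweep_once(toy)
```

Repeating `sweep_once` hundreds of times, as the researchers did, would stack these frames into the signature patterns discussed below.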
Prof. Aaron Quigley, chair of human-computer interaction at St. Andrews, says the university was one of 40 institutions to receive a Soli alpha development kit from Google. The kit, about the size of a pack of playing cards, contains the chip surrounded by circuitry and hardware to collect and process the data.
At first, his team of two student researchers used the kit to try to measure wrist movement in the air. But when one of them placed their wrist right on the kit, they realized that if they sampled the return radar signals hundreds of times, the pattern was unique enough to identify a wrist, as well as inanimate objects.
The unique radar patterns, which capture an object's surface characteristics, its composition and even its rear surface, were fed into a software program the team created called RadarCat, or radar categorization of objects. The result was a dictionary of objects. This is supervised machine learning: the program can't work out what an object is on its own the first time, so it must first be shown labelled examples.
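The supervised step can be sketched as a toy "dictionary of objects": labelled signatures go in, and new readings are identified by the nearest known match. The signature vectors and labels below are invented for illustration; the article does not describe RadarCat's actual classifier or feature set.

```python
# Toy supervised classifier: learn a "dictionary" of labelled radar
# signatures, then identify new readings by nearest match.
# Signatures are made-up feature vectors, not real Soli data.

def distance(a, b):
    """Euclidean distance between two signature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

class SignatureDictionary:
    def __init__(self):
        self.entries = []  # list of (signature, label) pairs

    def learn(self, signature, label):
        """Supervised step: a human supplies the correct label."""
        self.entries.append((signature, label))

    def identify(self, signature):
        """Return the label of the closest known signature."""
        return min(self.entries, key=lambda e: distance(e[0], signature))[1]

cat = SignatureDictionary()
cat.learn([0.9, 0.1, 0.3], "glass of water")
cat.learn([0.2, 0.8, 0.5], "smartphone back")
cat.learn([0.4, 0.4, 0.9], "human wrist")

label = cat.identify([0.85, 0.15, 0.35])  # nearest to the glass's entry
```

A nearest-match lookup is the simplest possible stand-in; any supervised classifier trained on labelled signatures plays the same role.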
But once RadarCat does learn an object, researchers say testing showed the program will identify it again — not only inanimate objects but living ones as well, like your wrist, although the program is less accurate when it comes to body parts.
With more development, though, it could mean that one day your phone will know where it is on your body and change its interface accordingly, creating a louder ring tone because it's in your pocket or larger icons because it knows you're tapping with a gloved hand.
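The adaptive behaviour imagined above amounts to a lookup from detected context to interface settings. The contexts and responses in this sketch mirror the article's examples; none of this is a real phone API.

```python
# Hypothetical sketch: once a phone can classify the surface it is
# touching, it could adjust its interface. Context names and settings
# are illustrative only.

ADAPTATIONS = {
    "pocket fabric": {"ring_volume": "loud"},   # phone is in a pocket
    "gloved hand":   {"icon_size": "large"},    # coarser taps expected
    "bare hand":     {"icon_size": "normal"},
}

def adapt_interface(detected_context):
    """Look up interface tweaks for the surface the phone detects."""
    return ADAPTATIONS.get(detected_context, {})  # unknown: change nothing

settings = adapt_interface("gloved hand")
```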
“What we’ve done is show the research is possible,” Quigley says.