USA TODAY International Edition
Google ushers in ‘ambient’ age
NEW YORK — It’s not the computers you can see that are going to matter most, it’s the ones you can’t. At least, that’s the argument Google made at its Made by Google hardware launch event here, as it unveiled a number of products that provide intelligence or interactions in ways that blend into the environment around us.
The tech industry has been talking about this notion of “ambient computing” for some time, but it has taken advances in areas such as artificial intelligence, cloud-based services, and wireless connectivity to start to make it real.
To be clear, the kinds of things Google unveiled at the event – from the widely expected Pixel 4 smartphone, to Pixel Buds earbuds, and updated versions of its Nest Mini smart speaker (previously Google Home Mini) and Nest WiFi (formerly Google WiFi) mesh routing system – have not reached cloak-of-invisibility-level powers. However, the refinements the company added to these products, in conjunction with the software advancements in Android and the Google Assistant, are making it easier to get access to the kinds of information we expect from our computing devices in more natural ways.
As an example, one of the most intriguing features of the Pixel 4 (which starts at $699 and, for the first time, is available from all major U.S. carriers) is its Motion Sense gesture-based features. Motion Sense provides a way to interact with your device without having to touch it. Powered by a Google-designed wireless chip called Soli that creates a radar-like field around the phone, Motion Sense gives the Pixel 4 a better sense of its surroundings and context, which translates into some interesting new features.
For instance, you can get basic command and control functions over the phone by moving your hands through the 12-to-18-inch field to do things such as answer a call, mute the ringer or other notifications, advance music tracks and more. In addition, the feature can be used for more subtle capabilities, including providing faster face detection or turning off the display to lengthen battery life when it detects no one is there.
While some may argue that you can easily do similar things with voice commands, there are situations where using your voice isn’t appropriate, so the gestures represent a new type of user interface. Google engineers suggested future capabilities will allow the gestures to be used in conjunction with voice to provide the same kind of additional meaning that gestures provide in face-to-face human conversations.
On the new $49 Nest Mini smart speaker, the “ambient” capabilities are the result of integrating a new AI chip inside the device. This makes the Nest Mini capable of performing the kinds of Google Assistant speech recognition and other capabilities directly on the device, without having to use the cloud.
While that may seem unimportant, this on-device AI has several real-world benefits, including faster response time and enhanced privacy, because none of the conversations are sent to the cloud. Over time, expect to see more devices incorporate this ability and, in the process, offer more of the always-on, always-available computing resources that ambient computing implies.
One of the most interesting applications of ambient computing that Google showed at the event won’t be available until next spring, when it releases the new wireless Pixel Buds ($179). The earbuds provide up to five hours of wireless access to the Google Assistant from as far away as 100 yards from a connected phone – essentially turning them into a wearable computing device that can be integrated into most any environment.
The concept of ambient computing may seem like science fiction to many, and even with these announcements, it would be hard to say we’ve entered an entirely new computing era. Yet what these developments do make clear is that future advancements in technology may not take the more visible paths we’ve seen up until now.
Many of the most interesting tech products and services are going to be harder to see.