USA TODAY International Edition

Here’s one way Google sees search changing for you

Lens makes your camera the way you find info

- Jessica Guynn

SAN FRANCISCO – Google thinks smartphone technology should be, well, smarter and do more of the work for us.

The search engine giant is rolling out Google Lens as a preview with its new Pixel phones. Pixel users will be the first to try Google Lens in Google Photos and the Google Assistant. It will come to other devices “in time,” the company says.

Instead of searching the Internet with words, you will be able to search the world with photos. Google Lens turns your smartphone camera into a search engine. You point the camera at something, and Google figures out what it is, whether it’s a photo from a family vacation five years ago or a painting hanging on the wall.

It’s a new frontier in search, creating an Internet search box that hovers over the real world. Spot a flyer for piano lessons on a telephone pole? Google Lens can grab the email address and shoot off a note. Can’t decide whether to watch “Wonder Woman” on Friday night? Point the camera at the screen, and ask, “Is this movie worth watching?” Ditto for that new book from Zadie Smith.

The Lens feature is part of Google’s big push into an “AI-first” world being led by chief executive Sundar Pichai.

At the heart of Pichai’s vision is the belief that we are increasingly moving toward a world that runs on artificial intelligence, meaning no matter what screen we are interacting with — a smartphone or a smart-home device — we will be helped by the invisible hands of smart machines that answer our questions and help us complete everyday tasks.

It’s a big leap forward from the days of typing a string of words into the Google search engine, allowing the Internet giant to show lucrative search ads. Now Google is competing with other tech giants to assist consumers in their everyday lives.

Visual search with Lens, like voice search, is one way Google is adapting to how people want to retrieve information and complete tasks.

“In an AI-first world, I believe computers should adapt to how people live their lives rather than people having to adapt to computers,” Pichai said earlier this month.

Google first showed off Lens at its I/O conference for software developers in May. At the time, the use case that drew the most applause was the one that showed how Lens can help with a common and frustrating task: logging into your Wi-Fi network. With Lens, you can take a picture of the sticker on your router that has the name of the network and the password, and your phone will automatically connect to it.

Other tech companies have developed visual search features, such as Samsung’s Bixby Vision, Amazon’s Firefly and Pinterest’s Lens.

How Google Lens works: It’s built into Google Photos and Google Assistant. Eventually you will see a Lens button in the Google Photos and Google Assistant apps. Tap on the Lens icon, and it will summon information for you.

“The really cool thing about Lens is that it represents a way to interact with the real world that we really haven’t had a chance to do before from a search perspective,” said Gartner analyst Brian Blau.

ELIJAH NOUVELAGE, AFP/GETTY IMAGES – Aparna Chennapragada, senior director of product at Google Inc., talks about Google Lens, an ambitious new AI-centered app, at a product launch event Oct. 4.
