WWD Digital Daily

Shoppers Can Now Teach Google What They Like and See More of It

● Google amps up personalized shopping with new updates letting users rate product search results, fave brands, visualize with AI-image generation, and more.

BY ADRIANA LEE

Connecting people with products is a lucrative game, and a tricky one to get right. Even Google, the master of search with an enormous Shopping Graph of retail data, realizes it can't do it alone. So the company is enlisting the help of users.

On Wednesday, the tech giant revealed a set of updates that allow people to rate styles in their product search results and mark their favorite brands, effectively teaching Google what they like so they can see more of it. Together with its AI-generated shopping images and virtual apparel try-on tool, the company's commerce objectives are clearly zeroing in on fashion.

The new “Style Recommendations” feature allows individuals to rate their product search results, according to Sean Scott, vice president and general manager of consumer shopping at Google. Explaining how it works, he said, “when you search for certain apparel, shoes or accessories items — like ‘straw tote bags' or ‘men's polo shirts' — you'll see a section labeled ‘style recommendations.' There, you can rate options with a thumbs-up or thumbs-down, or a simple swipe right or left, and instantly see personalized results.”

The format may feel familiar to Stitch Fix users. Its Style Shuffle game uses a similar thumbs-up/thumbs-down rating mechanism to learn tastes and fashion preferences. What's interesting about these scenarios is that both Google and Stitch Fix boast about their AI and machine learning capabilities, but still take in user feedback to boost the relevancy of their recommendations.

Getting consumers involved in their own preference modeling isn't a bad idea. If done right, it can elevate transparency. And if people don't like the feature, Google said, they can shut it off by clicking the three dots next to the “Get style recommendations” section, or by digging into personalization options in the “About this result” panel.

Like others, Google also seems to understand a pivotal fact about shopping for fashion: It's an inherently visual task that has historically hinged on text descriptions. Using images to search for something visual makes far more sense, and advances in computer vision make that feasible — which is why a growing number of platforms have been investing in the technology, most recently Samsung in its latest Galaxy smartphone. But this usually requires the user to photograph the item or have an existing image of it.

That seems almost quaint now in the age of generative AI. Google seems to think so too, since it created an experimental AI-based visualization tool that can power product searches.

“We developed AI-powered image generation for shopping so you can shop for apparel styles similar to whatever you had in mind,” said Scott. Users simply tell the system what they're looking for, and the feature creates a realistic image of it and then finds items that best match the picture.

The tool plugs into its Shopping Graph, a data set of product and seller information packed with more than 45 billion listings that are continuously refreshed — to the tune of more than 2 billion listing updates every hour.

It's not a polished release, but an experimental feature. Anyone in the U.S. can try it out, but they must have opted into Search Generative Experience in Search Labs, Google's testing ground, to access it in the Google app or mobile browsers.

Once people find the look they seek, the next step, according to Google, is to visualize what it might look like when worn. This is where its virtual apparel try-on steps in.

Announced in June 2023, this feature uses AI to digitally place clothes on human models. “Sixty-eight percent of online shoppers agree that it's hard to know what a clothing item will look like on them before you actually get it, and 42 percent of online shoppers don't feel represented by the images of the models that they see,” Lillian Rincon, Google's senior director of product, told WWD at the time.

To remedy that, the company shot a diverse range of people, then used AI — fed by data from its Shopping Graph — to realistically depict how fabrics would crease, cling or drape on different figures.

It doesn't work for every brand, product or clothing category, at least not yet. For now, U.S. users can find it on desktop, mobile and the Google app by looking for the “try-on” icon in shopping results for men's or women's tops, available in sizes XXS to 4XL.

Few things can sink a shopping experience like picking through piles of erroneous items, which still happens, even with recent breakthroughs in machine learning. That must be very apparent to Google, which sees people shopping more than a billion times per day. The fact that it aims to solve the issue with a blend of user feedback and AI features suggests that, even as the tech advances, it can't completely replace the human element.

[Image] Google allows people to rate styles in product search results, favorite brands and use AI features.
