Google Search Gets ‘Shop the Look’, Adds 3D Sneaker Images – WWD
Google’s ambitions for the fashion shopping experience are coming into view.
At its annual “Search On” event on Wednesday, the Mountain View, Calif.-based technology company launched several updates for search and product discovery in its mobile app that go to the heart of the apparel and footwear industries. The company introduced pilot programs that incorporate 3D product images for sneakers into search results, as well as a new “Shop the Look” feature for apparel, personalized results and other updates to help U.S. consumers feel more confident when shopping.
In total, the company unveiled nine new tests and tools with the stated aim of making shopping more visual and immersive, while helping consumers stay better informed. It’s the search engine doing what it does best: slicing and dicing its huge trove of data in new ways and flexing its considerable machine learning muscle to roll out new features. In other words, it’s peak Google, and it’s aimed squarely at fashion.
One sign of the priority Google places on shopping in general, and fashion in particular, is placement: the features were designed for the main search engine, not shunted off to the Google Shopping section. That includes a notable new pilot program that populates results with 3D product images for sneakers.
Think of it as a follow-up to the platform’s 3D home goods imagery, introduced earlier this year. According to Google, people interacted with those 3D images nearly 50 percent more than with still photos, so it’s eager to expand virtual imagery to other categories, starting with the feet.
Shoppers will be able to view more realistic sneaker images, zooming in on details and rotating them to see every angle before purchasing. The test is currently limited to Vans and a handful of other brands, but later this year all sneaker companies will be able to add their own assets.
Of course, not all brands or merchants have 3D imagery or the resources to create it. To remove that obstacle, Google has developed a new tool that uses machine learning to make products rotatable by stitching together standard 2D merchandise images. Another limited pilot program is underway to test the tool through the Google Manufacturer Center, as detailed on a help page in the support section.
But that’s only part of the new experience, according to Google, and it all starts with a simple change in user behavior: in the U.S., consumers begin by simply typing the word “shop” alongside a product name or keywords.
From there, they’ll see results displayed with “a visual feed of nearby products, search tools, and inventory related to that product,” Lilian Rincon, senior product manager at Google, wrote in a blog post. “We’re also extending the shoppable search experience to all categories and more regions on mobile (and soon on desktop).”
As Rincon explained in an interview with WWD, the results include a shoppable display showcasing products, lifestyle images, guides and more from a wide range of retailers and brands.
“One of the new tools, which we’re calling Shop the Look, is helping people put together the perfect outfit,” she said, citing the example of a search for a bomber jacket. The results would show photos of different styles of the item itself, as well as complementary pieces and where to buy them, right in search. It’s like Google’s version of styling services, except it’s not necessarily based on the instincts or talents of human tastemakers. It’s informed by data.
Shop the Look and other features rely on machine learning, particularly Google’s Shopping Graph. The AI-based model ingests data from across the web or provided by merchant partners, and in the past year alone the number of product listings it understands has grown from 24 billion to more than 35 billion, Rincon said.
The listings, plus what and how consumers are searching, power Shop the Look, along with a new Trending Products feature launching in the U.S. this fall.
Shop the Look and Trending Products join several other new features, including Page Insights, which surfaces more information about featured products when visiting a web page, such as pros and cons or star ratings; a buying guide that lays out different considerations when evaluating a product; opt-in deal or price-drop alerts; and, for consumers who shop with Google, personalized shopping results based on their preferences and buying habits. They can change details like favorite brands and department stores, or turn off personalization if they don’t want the feature.
Another update brings dynamic shopping filters that change based on what’s trending.
“For example, when I’m shopping for jeans, I might see filters for wide-leg and bootcut, because those are the denim styles that are popular right now,” Rincon explained. “And if jeggings ever come back in fashion, they might be suggested as a filter in the future.”
A new Discover feature in the Google app will also start suggesting styles based on what the individual and other consumers have searched for and purchased. “If you like vintage styles, you’ll see a popular vintage looks query suggestion,” she added. “And then you can tap on whatever catches your eye and use Lens to see where to buy it.”
The changes look intriguing, but in the end they won’t amount to much if no one uses them. That’s why their visibility on, and access from, the main Google search page matters, though it also raises the question of whether there’s still value in a dedicated Google Shopping page. Whatever its fate, it’s clear that shopping is no sideshow for the search giant. It’s the main attraction, and likely a strategic move to capitalize on recent market trends.
Search dominance is an existential question for Google, as search is its main revenue source, but more and more consumers now start their product hunts on Amazon. According to e-commerce software developer Jungle Scout, 61 percent of online consumers began their product searches on the e-commerce giant’s site in the second quarter.
While impressive on its face, that figure actually marks a decline from the 74 percent noted in the first quarter of 2021. While Amazon has seen some attrition, the broader search engine category has held steady at 49 percent. That could look like an opening for Google to gain traction. By making shopping more visual, it’s building on investments in e-commerce, an area that has paid off, as chief executive officer Sundar Pichai told analysts during parent company Alphabet’s second-quarter earnings call.
“People shop on Google over a billion times every day,” Pichai said. “We see hundreds of millions of shopping searches on Google Images every month.”
So far, the company has seen success with visual shopping in markets like Japan and India, according to Rincon. Adding virtual imagery, given its high engagement compared to 2D visuals, makes sense as a way to accelerate that traction even further. It also aligns with the organization’s other initiatives, which span ads and shopping from YouTube to Google Search, folding experiences like augmented reality makeup try-ons and virtual furniture previews into everyday shopping habits. Now it’s keen to do the same with 3D sneakers, and it won’t let obstacles, like a lack of visuals, get in the way.
“Our new ML model takes just a handful of photos and creates a compelling 3D representation of an object, in this case the shoe. This new model relies on Neural Radiance Fields, or NeRF, which is a foundational paper we collaborated [on] with UC Berkeley and UC San Diego,” Rincon said. NeRF is a neural network that can, essentially, use ML to fill in the visual gaps between 2D photos to create 3D images.
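For readers curious how NeRF “fills in the gaps,” here is a rough, hypothetical sketch of the volume-rendering step at its core, not Google’s actual model. In a real NeRF, a trained neural network maps each 3D point to a color and a density; below, a toy hand-written function (`toy_radiance_field`, an assumption for illustration) stands in for that network, and a camera ray is rendered by alpha-compositing samples along it:

```python
import numpy as np

def toy_radiance_field(points):
    """Stand-in for NeRF's trained network: returns (rgb, density) per point.
    Here, a hypothetical 'solid' red sphere of radius 0.5 sits at the origin."""
    dist = np.linalg.norm(points, axis=-1)
    density = np.where(dist < 0.5, 10.0, 0.0)          # opaque inside the sphere
    rgb = np.tile([1.0, 0.0, 0.0], (len(points), 1))   # constant red color
    return rgb, density

def render_ray(origin, direction, near=0.0, far=2.0, n_samples=64):
    """Composite colors along one camera ray, NeRF-style."""
    t = np.linspace(near, far, n_samples)
    points = origin + t[:, None] * direction            # sample points on the ray
    rgb, sigma = toy_radiance_field(points)
    delta = np.diff(t, append=far)                      # spacing between samples
    alpha = 1.0 - np.exp(-sigma * delta)                # per-sample opacity
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))  # light surviving so far
    weights = trans * alpha
    return (weights[:, None] * rgb).sum(axis=0)         # final pixel color

# A ray through the sphere renders red; one that misses it renders black.
hit = render_ray(np.array([0.0, 0.0, -1.5]), np.array([0.0, 0.0, 1.0]))
miss = render_ray(np.array([2.0, 0.0, -1.5]), np.array([0.0, 0.0, 1.0]))
```

In the actual technique, the network’s weights are fitted so that rays rendered this way reproduce the handful of 2D product photos, after which the scene can be rendered from any new angle, which is what makes the sneakers “rotatable.”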
Rincon believes the technology is a game changer for small brands and merchants, and she’s not alone. Forma has developed similar technology, which has fueled partnerships with players from Bold Metrics to Snapchat, and even Apple has jumped in with Object Capture, a developer tool announced in 2021 that uses photogrammetry to ease the work of converting 2D images into 3D objects. Amazon also supports virtual showrooms and shopping environments through its partnership with Adobe, as well as AR features for virtualized products in its marketplace.
Although the 3D images created by Google cannot be exported or used outside the platform, at least for now, the effort could go far in cementing virtual shopping as a fundamental consumer behavior. Currently the approach applies to real-world goods, but there are implications beyond the physical world as well. To be clear, this move isn’t exactly a metaverse strategy. But it seems related, perhaps as something adjacent, at least in potential, if not in reality.
“Things that are sister to these types of experiences are not only [about] viewing 3D assets on their own, but also moving into augmented reality, right?” Rincon said. “So something you can imagine is looking down at the street and, you know, [seeing] a 3D shoe, and then having a way to try it on yourself and see, with your camera, what it looks like on your feet.”
Virtual shoe try-ons have been available for years, but not in the same place where people search for everything else, where searches can shape results, other data can inspire complementary choices and outfits can be pieced together on the spot.
Whatever the reality, physical or virtual, Google’s ambitions clearly extend to fashion, so much so that it’s now even diving into the realm of style. But as Stitch Fix, Amazon and other tech platforms that employ human stylists can attest, it takes more than data to style people. The science behind it has advanced by leaps and bounds, but there’s an art to it too, at least in the right hands, and it’s not at all clear whether Google has the chops. Soon, shoppers will be able to judge for themselves.