Google is the latest big tech player to go all in on agentic shopping experiences.
The technology giant announced Tuesday that, in the coming months, it will launch two new AI-powered shopping features to the general public. One feature, called AI Mode, has already been tested in Google Search Labs — a platform that effectively allows Google to beta test new features with interested users. The function allows a user to have a conversational interaction with Google’s interface when looking for a specific product.
For instance, a user might tell Google, “I’m looking for a casual summer dress to wear on vacation with friends.” From there, AI Mode, powered by Gemini 2.5, shows the user a grid of options that match their query, based on consumer instruction and intent, paired with merchant product availability.

If the user isn’t fully satisfied with the options AI Mode pulls for them, they can provide further context; for instance, the shopper looking for a summer dress might further refine their search by telling Google their upcoming vacation is to Puerto Rico. Instead of suggesting dresses with long mesh sleeves, which might be suitable for summer in the northeastern U.S., AI Mode will likely adjust, offering up options more appropriate for high temperatures — a sleeveless midi dress, for instance.
Lilian Rincon, vice president of consumer shopping product for Google, said AI Mode may serve users sponsored content from advertisers.
“We are going to be experimenting with ads across all of these properties, in the same way that we’ve brought ads to AI Overviews over the last year,” Rincon said. “Expect that advertisers can participate, and we’ll have more to share on that later in the year.”
That approach differs from the route OpenAI has taken with its AI-powered shopping experience, launched late last month. At the time of the launch, the company told WWD’s sister publication Sourcing Journal it would not allow sponsored content in the initial rollout of its shopping tool.
That a user can have a conversation with AI Mode, powered by generative AI, doesn’t inherently make it agentic. But Google wants to take the experience one step further. Once a consumer finds a product they have interest in purchasing, they will be able to put a price tracker on it. That is to say, if a user sees a dress retailing for $250, but they only want to purchase it if the price falls at or below $175, they can instruct Google to monitor that item.
Rincon said that this type of automation removes friction for a consumer; rather than checking back on a product every couple of days, consumers can rely on the technology to check across merchants for them in real time.
“Part of the brilliance of this agent is that it will monitor the world for that product across different merchants and different stores, and actually notify you…that we found the exact product, the right size, the right color, at the price point — or below — that you want it,” Rincon said. “That’s part of the value prop of this agentic technology, is that it can monitor the world for you.”
If the item’s price falls at or below the consumer’s desired spending limit, Google will send the user a notification with a “buy for me” option. If the user selects that option, Google confirms the credit card and shipping address associated with that user’s account, then proceeds to autonomously purchase the item using Google Pay.
To start, only merchants that have Google Pay enabled will be eligible for agentic purchasing.
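For readers who think in code, the logic described above boils down to a watch on one specific product variant with a price threshold. The Python sketch below is purely illustrative; the class and function names, data shapes and figures are hypothetical stand-ins and do not reflect Google's actual systems.

```python
from dataclasses import dataclass

@dataclass
class Offer:
    merchant: str
    size: str
    color: str
    price: float
    in_stock: bool

@dataclass
class TrackedItem:
    product_id: str
    size: str
    color: str
    target_price: float  # e.g. 175.00 for the $250 dress example above

def check_tracker(item: TrackedItem, offers: list[Offer]):
    """Return the first offer matching the exact variant, in stock,
    at or below the shopper's target price; otherwise keep waiting."""
    for offer in offers:
        if (offer.in_stock
                and offer.size == item.size
                and offer.color == item.color
                and offer.price <= item.target_price):
            return offer  # would trigger the "buy for me" notification
    return None

# Illustrative usage: the dress drops from $250 to $169 at one merchant.
dress = TrackedItem("dress-123", size="M", color="white", target_price=175.00)
todays_offers = [
    Offer("merchant-a", "M", "white", 250.00, True),
    Offer("merchant-b", "M", "white", 169.00, True),
]
match = check_tracker(dress, todays_offers)
if match is not None:
    print(f"Notify shopper: found it at {match.merchant} for ${match.price:.2f}")
```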
Rincon said allowing a user to opt into the purchase keeps the shopper in the loop on the decision, while AI does the behind-the-scenes work of buying the item.
“Basically what the agent is doing is…going to the merchant’s page, putting that item in your cart with the exact variants that you chose and then using your information to essentially check out on your behalf and make sure that that product is delivered to the address that you confirmed,” she explained.
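As a rough sketch only, the opt-in checkout Rincon describes maps onto a few steps: confirm the stored card and address, cart the exact variant on the merchant's page, then check out on the shopper's behalf. Every class and method name below is a hypothetical placeholder, not Google Pay's or any merchant's real interface.

```python
# Hypothetical, simplified sketch of the opt-in checkout flow described above.
class Account:
    def confirm_payment_method(self):
        return "visa-ending-4242"      # card already on file with Google Pay

    def confirm_shipping_address(self):
        return "123 Main St, Anytown"  # shipping address already on file

class MerchantSite:
    def add_to_cart(self, product_id, size, color):
        # The agent visits the merchant's page and carts the exact variant chosen.
        return {"product_id": product_id, "size": size, "color": color}

    def checkout(self, cart, payment, ship_to):
        # Checkout completes on the shopper's behalf.
        return (f"Ordered {cart['product_id']} ({cart['size']}, {cart['color']}) "
                f"to {ship_to} with {payment}")

def buy_for_me(product_id, size, color, account, merchant):
    """Runs only after the shopper taps 'buy for me' on the notification."""
    card = account.confirm_payment_method()
    address = account.confirm_shipping_address()
    cart = merchant.add_to_cart(product_id, size=size, color=color)
    return merchant.checkout(cart, payment=card, ship_to=address)

print(buy_for_me("dress-123", "M", "white", Account(), MerchantSite()))
```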

While that could prove convenient for the consumer, the experience lacks a few hallmarks typical of today’s e-commerce environment. All transactions Google’s AI makes on behalf of a consumer are done without logging into a brand or retailer’s site. That makes it more difficult for the merchant to capture data on the shopper, while also preventing the shopper from collecting or using any loyalty program benefits. Rincon said the company hopes to upgrade that piece of the shopping experience in the future.
What’s more, if a shopper isn’t actually visiting a brand or retailer’s site when an AI agent makes a purchase on their behalf, the retailer may lose the incidental sales a browsing human might add to their cart. The agent, by contrast, carries out only the predefined job for the consumer it’s tasked with serving.
Rincon said the price-tracking tool, in particular, could help merchants earn sales they otherwise wouldn’t have made.
AI Mode and the price-tracking and agentic capabilities that go along with it will be available to consumers later this year, Google said.
But the company also had news to share on a more immediate upgrade to users’ shopping experience.
Google launched an updated version of its existing virtual try-on tool Tuesday. The tool has already used synthetic, digitally generated models to help consumers get an idea of how an item might look; now, consumers can see how a specific item would look on a digital rendering of themselves.
“Two years ago, we introduced virtual try-on with models that allows you to choose a product, choose a model that represents you and then see what that product will look like on a model,” Rincon said. “We’ve heard a lot from consumers — and also from merchants — that the thing that everyone really wants…is to be able to try something on yourself.”
To engage with the tool, users need to submit a well-lit, full-body photo. The system produces better results if the user is wearing formfitting clothing in that photo, since it gives Google a clearer read on the consumer’s body shape.
Rincon said the idea behind the tool is to help consumers visualize how an item might look on their body; however, Google does not currently recommend specific sizing based on the image a consumer provides.
“[Fit] is the ultimate challenge, and definitely where we want to head toward, but we are starting with visualizing what the product will look like on you,” Rincon said.

The virtual try-on function, which has now rolled out in Google Search Labs, does not yet allow consumers to digitally try on multiple items simultaneously. So, if a user showed interest in a pair of pants and a T-shirt, their avatar could try on each item separately but could not yet show the two together.
Categories eligible for Google’s virtual try-on include dresses, skirts, shirts and pants; the tool does not currently support accessories.
“We are really using the best of the Gemini models and AI to bring shopping to this next era,” Rincon said.