US tech giant Nvidia has unveiled a new blueprint for AI-driven shopping assistants.
The blueprint is aimed at developers who want to build digital systems that work with and for retail employees, bringing the expertise of sales associates, stylists, or designers to their customers.
Nvidia's ‘blueprints’ are reference workflows for both agentic and generative AI use cases.
During a press briefing, the company's director of product marketing for retail, Cynthia Countouris, said the launch builds on two cross-industry AI blueprints announced by chief executive Jensen Huang earlier this week, describing these as the "starting blocks to develop custom agentic AI solutions".
Agentic AI can solve more complex, multi-step problems through iterative planning and sophisticated reasoning. Generative AI, by contrast, produces responses to a single interaction using natural language processing.
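To make the distinction concrete, here is a minimal, purely illustrative sketch contrasting a single-interaction generative response with an agentic loop that plans a step, calls a tool, and checks whether the goal has been met. All names and logic are hypothetical and are not taken from Nvidia's blueprint.

```python
# Toy contrast between a single-interaction "generative" response and an
# "agentic" loop. Hypothetical example only, not Nvidia's blueprint code.

def generate_answer(question: str) -> str:
    """Single interaction: one question in, one answer out."""
    # A real system would call a language model here.
    return f"Canned answer to: {question}"

def run_agent(goal: str, tools: dict, max_steps: int = 5) -> list[str]:
    """Agentic pattern: iteratively pick a tool, run it, and stop once the goal looks met."""
    log = []
    for step in range(max_steps):
        # Plan the next action based on what has been done so far (toy heuristic).
        tool_name = "check_stock" if step == 0 else "recommend"
        result = tools[tool_name](goal)
        log.append(f"step {step + 1}: {tool_name} -> {result}")
        if "done" in result:  # Stop once the goal appears satisfied.
            break
    return log

if __name__ == "__main__":
    print(generate_answer("Do you stock red dresses?"))
    tools = {
        "check_stock": lambda goal: "red dress in stock",
        "recommend": lambda goal: "suggested matching heels; done",
    }
    for line in run_agent("find a red dress and matching shoes", tools):
        print(line)
```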
The company's blueprint enables shopping assistants to understand text and images, and includes a multi-query capability that can search for several products at once.
A customer, for example, can ask a general question about whether a retailer stocks a certain item while also requesting recommendations for a shoe to go with a particular dress.
“Notice it also handles context of an ongoing conversation as well,” said Countouris. “In effect, you can ask for additional details on a recommended product such as, what is the heel height of those shoes? Or, how much can that purse hold?
“Notice I’m being very conversational, just as if I were working with a store associate or stylist. And you're delivering this through digital channels to your customers.”
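As a rough illustration of the behaviour Countouris describes, the sketch below shows one way a shopping assistant might split a single customer message into several product searches while keeping the context of the ongoing conversation. The catalogue, helper functions, and decomposition logic are hypothetical stand-ins, not Nvidia's actual blueprint.

```python
# Hypothetical sketch: multi-query product search with conversational context.
# Not Nvidia's blueprint code; all data and helpers are illustrative.

CATALOGUE = [
    {"name": "black cocktail dress", "category": "dress", "heel_height_cm": None},
    {"name": "red stiletto heels", "category": "shoe", "heel_height_cm": 10},
    {"name": "suede ankle boots", "category": "shoe", "heel_height_cm": 5},
]

def search_products(query: str) -> list[dict]:
    """Simple keyword match standing in for a real multimodal search service."""
    terms = query.lower().split()
    return [item for item in CATALOGUE if any(t in item["name"] for t in terms)]

def handle_message(message: str, history: list[dict]) -> dict:
    """Run one search per sub-query and record the turn so follow-ups keep context."""
    sub_queries = [q.strip() for q in message.split(" and ")]  # naive decomposition
    results = {q: search_products(q) for q in sub_queries}
    history.append({"user": message, "results": results})
    return results

if __name__ == "__main__":
    history: list[dict] = []
    # One message, two product searches: the stock check and the recommendation.
    print(handle_message("black dress and heels to go with it", history))
    # A follow-up question can consult `history` to resolve "those shoes".
    last_shoes = [item for found in history[-1]["results"].values()
                  for item in found if item["category"] == "shoe"]
    print("Heel height of those shoes:", [s["heel_height_cm"] for s in last_shoes])
```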
In the home furnishings market, the shopping assistant can offer furniture recommendations, product details such as fabric care instructions, and a search option to find products similar to an image of an item. It also allows a customer to select furniture from a catalogue and see it represented in their own room.
This is made possible through Nvidia Omniverse, a real-time AI-based 3D graphics collaboration platform.
Explaining how the technology provides a hyper-personalised experience, Countouris said: “It takes context both from how the customer is engaging right now with the store associate and the AI blueprint can very easily be extended with retrieval-augmented generation (RAG) to look at the customer's previous purchase history and engagement. It's a RAG-based model.”
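The following is a minimal sketch of how a RAG-style step could pull a customer's previous purchases into the prompt before it reaches the model, assuming a hypothetical history store, a word-overlap ranking in place of a real vector search, and an invented prompt format; none of it is taken from Nvidia's blueprint.

```python
# Hypothetical RAG-style personalisation sketch: retrieve past purchases and
# augment the live request with them. Illustrative only, not Nvidia's blueprint.

PURCHASE_HISTORY = {
    "customer-42": [
        "navy linen blazer",
        "white leather trainers",
        "floral summer dress",
    ],
}

def retrieve_relevant_purchases(customer_id: str, query: str, top_k: int = 2) -> list[str]:
    """Rank past purchases by word overlap with the query (a stand-in for vector search)."""
    query_terms = set(query.lower().split())
    scored = [
        (len(query_terms & set(item.lower().split())), item)
        for item in PURCHASE_HISTORY.get(customer_id, [])
    ]
    scored.sort(reverse=True)
    return [item for score, item in scored[:top_k] if score > 0]

def build_prompt(customer_id: str, query: str) -> str:
    """Augment the current request with retrieved history before it reaches the model."""
    context = retrieve_relevant_purchases(customer_id, query)
    return (
        f"Customer previously bought: {', '.join(context) or 'nothing relevant'}.\n"
        f"Current request: {query}\n"
        "Recommend products consistent with their past purchases."
    )

if __name__ == "__main__":
    print(build_prompt("customer-42", "a dress for a summer wedding"))
```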
Coinciding with Nvidia's announcement on Friday, IT advisory firm SoftServe unveiled its generative AI shopping assistant, which was developed using the new blueprint.
The company's assistant is able to facilitate virtual try-on, allowing customers to visualise how products look on them directly through an online chat before making a purchase.
Nvidia has observed several trials in which the technology provides in-store interfaces for customers to virtually try on products, a feature that appears to be gaining traction.
As well as SoftServe, Dell Technologies and World Wide Technology (WWT) will use the blueprint in early access.
Beyond these partnerships, the organisation is in discussions with several retailers in Europe, with official announcements set for a later date.
Looking ahead, Countouris said that agentic AI will be transformative in the retail industry.
“...agentic AI just brings it another level further in helping to provide more tools and resources to workers in retail to be able to take on new tasks, do them more quickly and more effectively,” she added. “So this expansion is going to really revolutionise what's happening in retail.”
The director believes adoption of agentic AI is likely to grow over the next couple of years, predicting that the technology will be used not just in the back office for e-commerce but also in other areas such as the supply chain, where it could support warehouse and distribution workers by identifying conveyor system breakdowns, for example.