Netcore Unbxd Launches Agentic Multimodal Search with Voice Capabilities
Netcore Unbxd, a provider of product discovery solutions, has launched Agentic Multimodal Search to help e-commerce systems interpret shopper intent by understanding images alongside natural language input, whether typed or voice-enabled, within a single search experience.
"As commerce becomes more visual and AI-led, shoppers shouldn't have to translate intent into rigid search terms," said Rav Shankar Mishra, product and conversational director at Netcore Unbxd, in a statement. "Agentic multimodal search allows the teams to understand how shoppers see products and how they describe them, combining visual cues with language-based refinement in real-time."
Rather than processing inputs independently, the system evaluates visual signals and language signals together to form a cohesive understanding of shopper intent. Visual inputs anchor aesthetic context, while language introduces constraints and preferences. Results are then ranked using a combination of product popularity, user behaviour, geo-location, freshness, semantic understanding, and relevance signals.
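To make the mechanism concrete, the sketch below shows one way such a blended ranking could work: visual and language similarity are combined into a single relevance score, which is then mixed with popularity, freshness, and geo signals. This is a minimal illustration under assumed names and weights (Product, rank, the 0.35/0.15 weightings, and the toy embeddings are all hypothetical), not a description of Netcore Unbxd's actual implementation.

```python
# Illustrative sketch only: how a retailer *might* blend visual and language
# signals with behavioural ranking factors. All names, weights, and vectors
# here are hypothetical, not Netcore Unbxd's implementation.
from dataclasses import dataclass
from math import sqrt


@dataclass
class Product:
    name: str
    image_vec: list[float]   # embedding of the product image (assumed precomputed)
    text_vec: list[float]    # embedding of the product description (assumed precomputed)
    popularity: float        # normalised 0..1 behavioural signal
    freshness: float         # normalised 0..1; newer items score higher
    geo_match: float         # normalised 0..1 proximity/availability signal


def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = sqrt(sum(x * x for x in a)) * sqrt(sum(x * x for x in b))
    return dot / norm if norm else 0.0


def rank(query_image_vec, query_text_vec, catalogue, weights=None):
    """Score each product on combined visual + language relevance,
    then blend in popularity, freshness, and geo signals."""
    w = weights or {"visual": 0.35, "language": 0.35,
                    "popularity": 0.15, "freshness": 0.10, "geo": 0.05}
    scored = []
    for p in catalogue:
        # The image anchors aesthetic similarity; the text adds constraints.
        relevance = (w["visual"] * cosine(query_image_vec, p.image_vec)
                     + w["language"] * cosine(query_text_vec, p.text_vec))
        score = (relevance
                 + w["popularity"] * p.popularity
                 + w["freshness"] * p.freshness
                 + w["geo"] * p.geo_match)
        scored.append((score, p.name))
    return sorted(scored, reverse=True)


if __name__ == "__main__":
    # Toy example: an upstream model would encode the uploaded photo and the
    # typed or spoken refinement (e.g. "in linen, under $50") into these vectors.
    catalogue = [
        Product("linen shirt", [0.9, 0.1], [0.8, 0.2], 0.7, 0.9, 0.5),
        Product("denim jacket", [0.2, 0.8], [0.3, 0.7], 0.9, 0.4, 0.5),
    ]
    for score, name in rank([0.85, 0.15], [0.75, 0.25], catalogue):
        print(f"{score:.3f}  {name}")
```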
"Visual search answers what looks similar," said Nishant Jain, chief operating officer of Netcore Unbxd, in a statement. "Agentic multimodal search enables retailers to surface what aligns with the shopper's intent by understanding both visual inspiration and descriptive context together."
"Search is becoming the first agent in the commerce stack," added Nishant Arora, senior vice president of marketing at Netcore, in a statement. "The ability to understand visual and language intent together is becoming essential as commerce experiences grow more dynamic and AI-enabled."