Amazon has unveiled Lens Live, an AI-powered visual-search feature in its Shopping app that lets users scan items with their camera and instantly view product matches.
The feature, rolling out to tens of millions of iOS users in the US, uses AI-powered object recognition to deliver product matches in a carousel at the bottom of the screen.
Shoppers can tap to focus on a single item, add it to their cart or wish list, and access product summaries and suggested questions through Amazon’s AI assistant, Rufus.
The integration of Rufus allows for conversational prompts and instant product insights directly within the live camera view.
This move represents more than just a new feature – it is a step toward embedding AI-driven, real-time discovery directly into the digital shopping experience.
Real-Time Product Discovery
Behind the scenes, Lens Live leverages Amazon's cloud infrastructure, including Amazon OpenSearch Service and Amazon SageMaker, to run large-scale machine learning models capable of real-time product detection.
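Amazon has not published the internals of this pipeline, but the general pattern – embed a camera frame with a vision model, then retrieve the nearest product vectors from a search index – can be sketched with a toy nearest-neighbor lookup. The catalog, embeddings, and product names below are entirely invented for illustration; in a production system the vectors would come from a hosted encoder and the lookup from a k-NN index such as OpenSearch.

```python
import numpy as np

# Hypothetical product catalog with precomputed embeddings.
# Real systems would store millions of vectors in a k-NN index;
# these three 3-dimensional vectors are made up for illustration.
CATALOG = {
    "desk lamp":  np.array([0.9, 0.1, 0.0]),
    "table fan":  np.array([0.1, 0.8, 0.2]),
    "coffee mug": np.array([0.0, 0.2, 0.9]),
}

def top_matches(query_vec, catalog, k=2):
    """Rank catalog items by cosine similarity to the query embedding."""
    def cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))
    scored = [(name, cosine(query_vec, vec)) for name, vec in catalog.items()]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:k]

# A camera-frame embedding that happens to sit close to "desk lamp":
frame_embedding = np.array([0.85, 0.15, 0.05])
print(top_matches(frame_embedding, CATALOG))
# → [('desk lamp', 0.99...), ('table fan', 0.29...)]
```

The ranked results here play the role of the product carousel shown at the bottom of the screen: the closest vectors surface first, with no typed query required.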
Amazon’s main goal is to streamline product discovery with minimal user input, bringing frictionless shopping closer to reality.
The launch places Amazon in direct competition with Google Lens and Pinterest Lens, which have pioneered the consumer visual search space. But Amazon holds a unique advantage: immediate integration with its retail ecosystem.
The company says visual search usage in its app grew more than 50 percent over the past year, with photo-based searches more than doubling.
Lens Live builds on existing features such as barcode scanning, photo uploads, and the recently introduced Circle to Search function for identifying multiple products in a single image.
While the rollout is initially limited to iOS users in the US, Amazon plans to expand access in the coming weeks.