
When Amazon wrote about its shopping app’s image-search feature last year, it demonstrated how Amazon Lens could help you find an item you see in real life (“such as your friend’s magnetized fridge calendar that’s perfect for organizing you or the family”) by taking a picture of it, uploading the photo to Lens, and adding the item Lens finds to your Amazon shopping cart.
Last week, it launched “Amazon Lens Live,” which makes the process even easier and integrates Lens with Rufus, Amazon’s AI shopping assistant, to help users learn more about products. In its September 2, 2025 announcement, Amazon described the key Lens Live enhancements:
- Lens Live instantly scans products and shows real-time matches in a swipeable carousel to make finding the right item easier.
- Lens Live integrates Amazon’s AI shopping assistant, Rufus, to offer product insights and summaries and to answer questions as you browse.
- It is now available for tens of millions of customers in the Amazon Shopping app on iOS and will roll out to more customers in the coming weeks.
So if you start seeing your friends pointing their phones in your direction and then wearing your outfits, you may have Amazon Lens to blame (or credit if you feel flattered).
Because Amazon updated its original post announcing Amazon Lens in March, it’s difficult to determine exactly when Lens launched, but we found references to it as far back as July 2024, and Amazon’s mobile shopping app may have had image-search features even earlier than that.
In the same September 2, 2025 post, Amazon described how the Rufus integration works with Lens Live:
“To help customers using Lens Live learn more about products they’re viewing, we’ve integrated our AI shopping assistant, Rufus, into the experience. While in the camera view, customers will now see suggested questions and quick summaries of what makes a product stand out. These conversational prompts and summaries appear under the product carousel, allowing customers to perform speedy research, quickly access key product insights, and get their questions answered.”
A GIF in the announcement demonstrates how someone viewing a planter through Amazon Lens Live can see questions others have asked about the product, such as, “Does it come with a drainage hole?”; “Is the finish smooth or rough?”; and “Does it come with a saucer?” Tapping one of the questions on the screen returns an AI-generated answer, presumably drawn from Amazon content such as the product description and customer reviews.
Mita Mahadevan, Amazon’s Director of Visual Search and Search, who wrote about Lens on the About Amazon blog last year, left the company in January to become Vice President of Engineering at Pinterest. Trishul Chilimbi, Amazon Vice President and Distinguished Scientist, Stores Foundational AI, wrote the September 2nd post about Amazon Lens.