Google Lens is now available in the Google search app for the iPhone. Previously, iOS users could access Lens visual search only through the Google Photos app, which required taking a picture and then running Lens on it, an awkward, multi-step workflow.
In the search bar. Now Lens appears in the search bar next to the mic icon and is available with a single tap. It enables visual search of products and objects, buildings and places, plants and animals, QR codes, barcodes, business cards and virtually anything featuring text. Lens currently supports English, Spanish, French, German, Italian, Portuguese and Korean.
Lens gets a B. In my previous tests of Lens in Google Photos and on a Google Pixel phone, it performed relatively well; I'd give it a "B." On products, books and media, it performs about as well as Amazon's visual search, but it outperforms the latter overall with a broader range of object recognition capabilities.
[Embedded tweet: Google (@Google), December 10, 2018]
Using my iPhone in my kitchen this morning, Lens got about 75 percent of object and text searches right. Now that Lens is a clearly visible search option, more people will start to use it, and that additional usage should further improve its image recognition capabilities.
Why you should care. Though it remains to be seen how widely Lens will be adopted, it could become very popular, especially for identifying objects and products. It might also catch on as a way to check reviews for restaurants and other places as you're out in the world, an augmented reality use case.
So far, there's no optimization strategy for visual search the way there is for image search. But what about ads? In a product-search context, it's easy to imagine the eventual inclusion of Shopping ads.
The larger point, however, is that search on mobile devices is going to diversify further. Visual search will likely earn its place beside voice as an alternative to typed queries.