At the I/O 2017 conference last week, Google introduced its new and super impressive visual search tool known as Google Lens.
Google’s CEO described Lens as “a set of vision-based computing capabilities that can understand what you’re looking at and help you take action”.
In a nutshell, Lens is a way to ‘Google’ things by looking at them through your smartphone camera.
It’s kinda genius, because using our cameras in real-time is a habit that we’ve all picked up thanks to Snapchat and Instagram.
Picture this: you snap a picture of your HashtagBreakfast and – instead of adding a funky (but pointless) filter – Google shows you nutritional information, similar recipes and a blog post on the shocking truth behind that GMO buckwheat you’re eating.
Pretty awesome, right?
In Google’s on-stage demo, Lens identified a restaurant from its entrance and surfaced the business’s details – opening hours, ratings and more – straight from the camera view.
Google Lens is a step up from Google Goggles (which could identify items in photos but not do much else) and a more realistic take on the vision behind Google Glass – because wearing Google on your face was a step too far for many people.
Why Should You Care?
This is really going to shake up the search landscape for users and webmasters alike.
Previously, Google’s algorithms revolved around understanding text and webpages. The fact that they can now understand images will have profound implications for the future of search.
This development also reiterates the ever-increasing role that smartphones are playing in search. Google has clearly been investing heavily in mobile – have you?
Lastly, Google is always bringing new developments to the search world, and this is yet another example of what it’s capable of. The trick is to stay on top of the news and be flexible enough to adapt to these changes.
Google Lens won’t officially roll out until later this year, so you do have some time to prepare.
For more information, watch: https://www.youtube.com/watch?v=igTtOA1jcik