Google will soon be extending the reach of Google Lens, its visual search interface. In a blog post, the company announced Lens would be integrated into the Google Assistant in the coming weeks. The feature is still exclusive to Pixel phones, but now it should be a lot easier to access.
Google Lens debuted in beta on the Google Pixel 2, which launched last month. The service is basically a revamp of Google Goggles: you take a picture of something, run it through Google's computer vision algorithms, and Google will try to tell you what's in the picture. Google says Lens can identify text, landmarks, and media covers, but those were all things Goggles could do years ago. We tried Lens on the Pixel 2 at launch, and while it was definitely a beta with a lot of problems, it occasionally did something impressive, like recognizing not just that a picture contained a dog, but also nailing the dog breed.
Google says Assistant integration will allow you to get "quick help with what you see." This sounds like a big improvement over the current beta of Google Lens, which is only integrated into Google Photos. Doing any kind of recognition through the Photos app is really slow, since you have to open the camera app, aim it at something, take a picture, open the picture, and then run it through Lens. The Assistant version of Lens will be a lot easier: you just open the Assistant and tap on the Lens icon in the bottom-right corner.
Google says the Lens-in-Assistant integration will be coming to "Pixel phones set to English in the US, UK, Australia, Canada, India and Singapore during the coming weeks."