Google Lens receives Assistant support and wider release

Last year, Google announced the launch of its new Lens feature, designed not only to provide information about an image, but also to connect it to real-world information. The exciting news is that the feature, initially limited to Google’s own Pixel smartphones, is now being rolled out to most Android users in an update to the Photos app. iPhone users will also receive Lens at a later date.

At the time, I mentioned that for people who are blind or vision impaired, the Lens feature has the potential to provide significant benefits. While there are several effective apps available on mobile devices that can deliver image recognition and OCR capabilities, Lens has the additional benefit of connecting the image with meaningful data that is likely to be useful while the user is in that specific location. For example, a blind user could take a photo of a café and not only have the café itself identified, but a menu could be provided at the same time, along with information about the building’s accessibility.

In addition, the feature is being added to the Google Assistant. According to an article by Android Police, “Lens in Google Photos will soon be available to all English-language users on both Android and iOS. You’ll be able to scan your photos for landmarks and objects, no matter what platform you use. In addition, Lens in Assistant will start rolling out to ‘compatible flagship devices’ over the coming weeks. The company says it will add support for more devices as time goes on.”

With the Google Assistant also gaining the ability to use Lens, people with vision-related disabilities will find it much easier to simply speak to their phone to identify their surroundings. Additional information on the feature can be found at Google’s Lens information page.
