
Author: Scott Hollier

Google introduces live captioning and Lens improvements to Android Q

Google announced at its 2019 I/O developer conference that the upcoming version of Android, currently codenamed ‘Android Q’, will feature some significant accessibility improvements relating to the automated captioning of video and the addition of search to Google Lens.

Live Captioning

The Live Caption feature will allow users to download a video to their device and play it back with captions, regardless of whether the video was formally captioned or not. It makes use of similar technology to YouTube's automated captioning service, whereby Google scans a video and adds captions for you. The main difference here is that the ability to scan a video is built into Android Q, and the process appears to be near-instantaneous once a video is downloaded to a device.

While the Live Captioning feature is focused primarily on pre-recorded videos, it has also been demonstrated with real-time video calls. This has the potential to improve the communication options for people who are Deaf or hearing impaired worldwide. The following YouTube video showcases the feature in action.

While the promise of every video featuring captions, and even captioned live calling, is extremely exciting for people who are Deaf or hearing impaired, at the time of writing there are few options to test the feature outside of specific beta testing programmes. There is also some scepticism about its accuracy, given that the effectiveness of YouTube's automated captions relies heavily on broadcast-level audio quality and only caters for a limited number of languages and accents.

While the feature is primarily focused on people with a hearing disability, it is likely to have wider benefits for people wanting to watch video content in noisy environments such as on a bus or plane.

In terms of availability, people using Pixel and recent mobile devices affiliated with the Android One programme are likely to receive the update before the end of the year. Once a device is updated to Android Q, the feature can be enabled in the device settings.
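
For developers curious about the general idea, Live Caption itself does not currently expose a public API, so the sketch below is purely illustrative and is not how Google implements the feature. It uses Android's long-standing SpeechRecognizer API in Kotlin to print recognised speech from the microphone as rough live captions, which conveys the same speech-to-text concept.

```kotlin
import android.content.Context
import android.content.Intent
import android.os.Bundle
import android.speech.RecognitionListener
import android.speech.RecognizerIntent
import android.speech.SpeechRecognizer

// Illustrative only: prints recognised microphone speech as rough "captions".
// Live Caption works on media audio on the device and has no public API.
// Requires the RECORD_AUDIO permission and should be called on the main thread.
fun startIllustrativeCaptions(context: Context) {
    val recognizer = SpeechRecognizer.createSpeechRecognizer(context)
    recognizer.setRecognitionListener(object : RecognitionListener {
        override fun onPartialResults(partialResults: Bundle) {
            // Partial results give the "live" feel as words are recognised.
            val text = partialResults
                .getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                ?.firstOrNull()
            if (!text.isNullOrBlank()) println("Caption (partial): $text")
        }
        override fun onResults(results: Bundle) {
            val text = results
                .getStringArrayList(SpeechRecognizer.RESULTS_RECOGNITION)
                ?.firstOrNull()
            if (!text.isNullOrBlank()) println("Caption: $text")
        }
        // Remaining callbacks are not needed for this sketch.
        override fun onReadyForSpeech(params: Bundle?) {}
        override fun onBeginningOfSpeech() {}
        override fun onRmsChanged(rmsdB: Float) {}
        override fun onBufferReceived(buffer: ByteArray?) {}
        override fun onEndOfSpeech() {}
        override fun onError(error: Int) {}
        override fun onEvent(eventType: Int, params: Bundle?) {}
    })
    val intent = Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH).apply {
        putExtra(RecognizerIntent.EXTRA_PARTIAL_RESULTS, true)
    }
    recognizer.startListening(intent)
}
```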

Google Lens

Another accessibility-related improvement is an update to Google Lens. Google has incorporated search and some additional real-time functionality to help people interact with their environment by taking a photo.

According to Natt Garun from The Verge, “Google says Lens can search for exact dishes on a menu and surface photos of that dish based on Google Maps information to show you just how it looks before you order. You can also point the camera at the receipt to bring up a calculator that lets you add a tip then split the bill or at a sign in a foreign language to hear a text-to-speech translation.”

Google Lens remains a popular feature in Android for people who are blind or vision impaired, as it allows a person with a vision disability to take a photo and find out what is in the surrounding environment. The added functionality is likely to make Google Lens even more useful.

Accessibility features aside, the one remaining mystery about Android Q is its name. Google traditionally names its Android releases after sweet treats in alphabetical order, but as there aren't many desserts that start with 'Q', it will be interesting to see what choice Google makes.

For additional information on Google 2019 I/O announcements, visit The Verge website article.

Google Android apps round-up for people who are blind or vision impaired

There’s no denying that Apple led the way when it came to mobile accessibility for people who are blind or vision impaired. In 2009, when Apple released its iPhone 3GS, the integrated VoiceOver screen reader was revolutionary in making mobile touchscreen devices accessible. Over the past 10 years, competitors have gradually caught up, meaning that Apple iOS devices and Google Android devices are now largely comparable in terms of out-of-the-box accessibility features for people who are blind or vision impaired.

So why is it then that blind and vision impaired Android users still feel inferior when standing next to a person with an iPhone? It largely comes down to the apps. With Microsoft putting much of its accessibility emphasis on the Apple platform through apps such as Seeing AI and Soundscape, it often feels like Android users with vision-related disabilities are shut out of the benefits mobile devices can provide.

Yet it may surprise many to learn that the world’s most popular mobile operating system has a lot to offer people who are blind or vision impaired, with an assortment of useful apps and features. That’s not to take away from the iPhone, which is certainly a great device, but for those on a budget, here’s some guidance on what you can find if you are a blind or vision impaired Google Android user.

BUILT-IN FEATURES

TalkBack screen reader

First up, it’s important to acknowledge that Google Android has a great built-in screen reader called TalkBack. On most Android devices it can be found by going into Settings, then Accessibility. If TalkBack is not there, you can usually install it by downloading the Android Accessibility Suite from the Play Store, which adds several great accessibility features.
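
As an aside for app developers rather than end users, an app can detect when a screen reader such as TalkBack is running and adjust its behaviour accordingly. Here is a minimal Kotlin sketch using the standard AccessibilityManager API; the function name is mine, and nothing in it is specific to TalkBack itself.

```kotlin
import android.content.Context
import android.view.accessibility.AccessibilityManager

// Returns true when an accessibility service with touch exploration enabled
// (such as TalkBack) is currently active on the device.
fun isScreenReaderRunning(context: Context): Boolean {
    val am = context.getSystemService(Context.ACCESSIBILITY_SERVICE)
            as AccessibilityManager
    return am.isEnabled && am.isTouchExplorationEnabled
}
```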

Select to Speak

Rather than turning on TalkBack, you can simply select text and have it read out, which can be very convenient for people who only need some text read aloud occasionally.

Magnification

This feature lets you zoom into a portion of the screen, similar to a magnification tool on a desktop computer. It can be very useful for people with low vision who need part of the screen enlarged.

Colour correction

For people with a colour vision impairment, Android has a number of features that allow the user to modify the colour palette so that everything on the screen can be seen clearly.

Volume key shortcut

If you share a phone with others, you can quickly toggle an assistive technology feature such as TalkBack on or off by holding the two volume keys together for a few seconds. For example, you may want to turn off the screen reader when using a Camera app, then re-enable it after a photo is taken.

APPS

While Android is yet to receive some of the great Microsoft apps available on iOS, there are a number of Android apps that offer similar features and work really well.

Speak!

Speak! screenshot

Earlier in the year I was contacted by an Israeli engineer who developed and launched the free app Speak! The app is mainly designed to read out text, with an auto-read feature so you can move the phone around and it will keep reading whatever text it finds. It also has useful text orientation features and works well if you want to read whole pages from books or menus. While the app may not be as polished as some similar apps, it meets two very important criteria: it works, and it’s free. I’d strongly recommend that every blind or vision impaired Android user download this app as soon as possible.

Eye-D

Eye-D screenshot

Another favourite app is Eye-D. There’s a free version available, but I’d recommend paying for the Pro version, which is very reasonably priced at $AUD6.99. Eye-D is a ‘swiss army knife’-style app with a large number of tools, including an accessible camera, a ‘where am I’ feature, the ability to find out what services are nearby and navigate to them using Google Maps, and the ability to identify images and text within images, just to name a few. Given its large number of features, it’s another app worth considering if you have a vision impairment.

Envision AI

Heavily inspired by Microsoft’s Seeing AI app on iOS, Envision AI also features a variety of tools relating to image and text recognition, and it can read out handwriting and scan barcodes. I used the app during its free demo period and it worked very well. However, while Seeing AI on iOS is free, Envision AI charges a hefty $AUD221.99 for a lifetime subscription, which is apparently a discounted price. If this functionality is critical to your needs, for that amount you may be better off paying for an iPhone and using the free apps available there instead, or consider the previous two apps, which are far more reasonably priced.

KNFB Reader

A well-established app that can also help in reading out text in books and menus is KNFB Reader. As with the above, it has a demo mode, after which the price of purchase is $AUD159.99. Again, there’s no denying the effectiveness of the app, but I’d certainly recommend trying the free options first.

Magnification using the camera

In addition to speaking apps, there are many magnifier apps that can use your phone’s camera to view print up close and invert the colours if needed. There’s no one app to recommend as there are lots of good free options on the Play Store, but it’s worth spending some time trying them out to find what works best for you.

Google Lens

If you want an app that can describe your environment, Google Lens is a great option. Available both as a standalone app and as part of most Android camera apps, Lens allows you to take a photo and then have the scene described to you. Google is working on integrating search functionality and other helpful features, so if you want to try Lens, have a look at your phone’s camera app and see if it is supported.
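
Lens itself has no public API, but developers wanting to offer similar on-device text recognition in their own accessible apps can use Google's ML Kit. Below is a minimal Kotlin sketch, assuming the ML Kit text recognition dependency has been added to the project; the function name and callback are mine.

```kotlin
import android.graphics.Bitmap
import com.google.mlkit.vision.common.InputImage
import com.google.mlkit.vision.text.TextRecognition
import com.google.mlkit.vision.text.latin.TextRecognizerOptions

// Runs on-device text recognition over a photo and passes the result to a
// callback, where it could be announced via TalkBack or text-to-speech.
fun describeTextInPhoto(photo: Bitmap, onText: (String) -> Unit) {
    val image = InputImage.fromBitmap(photo, /* rotationDegrees = */ 0)
    val recognizer = TextRecognition.getClient(TextRecognizerOptions.DEFAULT_OPTIONS)
    recognizer.process(image)
        .addOnSuccessListener { result -> onText(result.text) }
        .addOnFailureListener { error -> onText("Could not read text: ${error.message}") }
}
```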

Uber Eats

It’s fair to say that Uber Eats is not an app designed specifically to assist people who are blind or vision impaired, but it is an important one to note for people who have the service in their area. Uber Eats is a food delivery app and, unlike most websites and apps associated with food delivery, it is completely accessible with the TalkBack screen reader. This means that you can confidently select a food outlet, choose items from the menu, pay and keep up to date with the delivery status without any accessibility issues. Given the challenges of trying to go out for food when you can’t drive, getting food delivered in a convenient and accessible way can be a very positive experience for people who are blind or vision impaired.

Launchers

BIG Launcher screenshot

One of the best things about Android is that, unlike iOS, your launcher can be replaced with one that better meets your needs. There’s a large number of free custom launchers available, with Nova Launcher being one very accessible example. My personal favourite is BIG Launcher, which works well for my needs by simplifying the interface, making use of a high contrast theme and working well with TalkBack. It also allows me to customise the buttons, such as putting my news app on the home screen. However, at the time of writing, the full version has been removed from the Play Store due to some issues with Google’s new policies on using the app for messaging. Hopefully it will return soon, but in the meantime you can try out the demo.

ANDROID V IOS – WHICH IS BETTER?

If a direct comparison were made between the latest versions of Android and iOS for a person who is blind or vision impaired, I’d say that Apple still wins out, due to iOS being so well established in the community and the wealth of app options. However, if you’re looking for accessibility on a budget, it’s still worth considering Android due to its great out-of-the-box experience and some great free apps – and lunch, thanks to Uber Eats!

Call for online disability access standards for computers from Equal Opportunity Commission: ABC

The ABC recently published a news item on its website highlighting the challenges faced by people who are blind or vision impaired and the underreporting of web accessibility issues.

In the article, written by Herlyn Kaur and titled Call for online disability access standards for computers from Equal Opportunity Commission, the key points highlighted are:

  • Navigating websites can be a struggle for Australians with a vision impairment
  • Australian laws prohibit discriminating against a disability when providing goods or services
  • WA’s equal opportunity commissioner has called for mandatory, enforceable standards

The article discusses the experience of screen reader user Siyat Abdi and the challenges he faces in trying to use websites as a blind user on a daily basis. In response, WA Equal Opportunity Commissioner John Byrne said the issue was not isolated to people with a visual impairment and explained how the limitations of the Disability Discrimination Act 1992 (DDA) may be a factor. Essentially, the DDA does provide a means to lodge a complaint, but complaints are few, and this is likely due to the lack of specific ICT standards in the DDA.

Regular readers of this website will be aware of my personal call, among the many who work in this space, to Fix the DDA and the need to legislate against digital access discrimination in Australia. It’s great to see the issue receiving some mainstream traction in the media.

W3C WAI improves resource translation support

The World Wide Web Consortium (W3C) Web Accessibility Initiative (WAI) has significantly improved its content so that it is much easier to locate documents such as the Web Content Accessibility Guidelines (WCAG) standard documents and associated resources in a variety of languages.

The All Translations section on the W3C website provides expandable menus for a number of languages; selecting a particular language expands to show all current W3C WAI translations in that language.

The resource currently features categories based on the following languages:

  • Arabic
  • French
  • German
  • Greek
  • Spanish
  • Japanese
  • Dutch
  • Russian
  • Simplified Chinese

In addition, W3C WAI appears to be relaxing its strict processes relating to document translations and is seeking volunteers to get involved. This will hopefully increase the number of documents available in other languages.

While at the time of writing some language categories, such as Arabic, are placeholders, it is encouraging to see more effort being put into an area which has traditionally been a weak point in the provision of accessibility standards in different languages.

UWA upskills staff to support LMS digital access integration

My colleague Dr Ruchi Permvattana and I recently ran two workshops to support the University of Western Australia (UWA) in its ongoing efforts to enhance its digital access commitments.  

Chris Leighton, Dr Scott Hollier, Dr Ruchi Permvattana & Elaine Lopes standing in a row smiling

The workshops were designed to ensure that the LMS implementation and education delivery process is accessible for UWA all the way from its initial setup through to its successful delivery to students. Topics included:

  • How do people with disabilities engage with your content? This included a demonstration of how assistive technology users engage with content produced by UWA. There was also an opportunity to experience the content in a similar way to a blind user.
  • LMS accessibility: this topic focused on how to consider accessibility when choosing an LMS platform
  • WCAG 2.1 Level AA in-depth
  • WCAG 2.1 Level AAA: is it worth it?
  • Guidance for Unit Coordinators, staff supporting academics and administrative staff in the areas of:
    • Preparing accessible content
    • Creating teaching and learning materials
    • Delivering best practice for information resources
    • Assessing the LMS against WCAG 2.1 with automated tools (a simple example of this kind of check is sketched after this list)
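
To give a flavour of what automated checking can mean in practice, here is a minimal Kotlin sketch, not one of the tools used in the workshops, that flags images missing a text alternative, one of the most common WCAG 2.1 failures in learning content. It assumes the jsoup HTML parsing library is available, and the URL is a hypothetical placeholder.

```kotlin
import org.jsoup.Jsoup

// Fetches a page and lists <img> elements with no alt attribute, a basic
// check against WCAG 2.1 success criterion 1.1.1 (Non-text Content).
fun main() {
    val url = "https://lms.example.edu/unit/overview" // hypothetical page
    val doc = Jsoup.connect(url).get()
    val images = doc.select("img")
    val missingAlt = images.filterNot { it.hasAttr("alt") }
    println("Checked ${images.size} images; ${missingAlt.size} have no alt attribute.")
    missingAlt.forEach { println(" - ${it.attr("src")}") }
}
```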

The workshops build on UWA’s commitment to digital access, which has included the recent purchase of the Blackboard accessibility ‘a11y’ pack and its continued efforts to review its policies and processes against the recently released WCAG 2.1 standard.

On behalf of Ruchi and me, thank you very much to UWA for the opportunity to support your digital access needs.