Dr Scott Hollier - Digital Access Specialist Posts

Outrunning the Night Vietnamese translation coming soon

I’m very excited to share that my book Outrunning the Night: a Life Journey of Disability, Determination and Joy has recently been translated into Vietnamese.

The translation project began when I visited RMIT Vietnam last year. There, Đỗ Đức Minh, an RMIT Vietnam Wellbeing Services employee who supports students registered with Equitable Learning Services, asked if my book could be shared with the wider Vietnamese community through a translated version. Thanks to his efforts and the dedication of translators Lê Thị Vân Nga and Đào Thị Lệ Xuân over the past 14 months, the translation is now complete.

The next steps for the Vietnamese edition will be to format the text so it can be printed as a paperback, and to develop an audio book so it can be easily read by people in Vietnam who are blind or vision impaired. Once completed, the audio book, e-book and 50 copies of the paperback will be provided free of charge to the Vietnamese community.

Many thanks to Đỗ Đức Minh for his hard work and dedication in making the translation a reality. I’m very much looking forward to shipping the paperback to Vietnam.

On a related note, the original e-book and audio book versions of Outrunning the Night have recently been added to the Curtin University library catalogue so that they can be used as a reference for students undertaking tertiary studies. Thank you to the people who requested the book’s inclusion and to the Curtin staff for taking the time to add it.

Android 9.0 Pie accessibility hands-on

Last month Google released the latest version of its Android operating system, continuing its trend of naming releases alphabetically after desserts; this time, Pie joins the list. While the accessibility changes are more incremental compared to Android 8.0 Oreo, users of earlier versions of Android such as KitKat, Marshmallow and Nougat should consider the upgrade.

Device used for testing

Before getting into the specifics of Android 9.0 Pie, I should mention a little about the device I’m using for this review. While it would be great to test Pie on the latest Google Pixel range of smartphones, I’m conscious that most people are unlikely to rush out and buy a new smartphone every time one comes out, so instead I’ve upgraded one of my old smartphones to see if that process brings the same accessibility benefits. For this review I’m using an old Motorola Moto G updated to Pie via the community-built Lineage OS, with the usual Google applications installed. While some other models may offer additional accessibility features, this review will give you an overview of the accessibility features consistent across different devices running Pie.

Android Accessibility Suite

One important recent change is that several Android accessibility features are no longer offered on the Play Store as individual apps; they are now bundled into a single app called the Android Accessibility Suite. Google explains the features as follows:

Android Accessibility Suite includes the following services:

  • The TalkBack screen reader adds spoken, audible, and vibration feedback to your device.
  • Switch Access lets you control your device with a switch.
  • Select to Speak lets you select something on your screen and hear it read or described aloud.

The benefit of this downloadable suite model is that it allows Google to continually update these accessibility features, and it brings functionality to older versions of Android that may not have had all of these features pre-installed.

Android 9.0 Pie accessibility features

There is a wealth of accessibility features in Android 9.0 Pie, continuing the trend of accessibility improvements with each new Android version. While there are not as many substantial changes from Oreo, the feature set is quite impressive.

Android Pie accessibility features screenshot 1 of 3

Specific accessibility features include:

  • Volume key shortcut: this allows you to set your favourite accessibility feature to be quickly enabled or disabled by holding the two volume keys together. This can be very useful if you rely on a feature such as TalkBack but then want to hand your phone to someone else, such as a family member taking a photo, and need the feature temporarily disabled.
  • TalkBack screen reader: this is the primary way that people who are blind or vision impaired use their Android device. While TalkBack has been available since Android 4.0, its feature set continues to grow.
  • Select to Speak: this provides quick text-to-speech functionality for people who just want something on the screen read out. This is achieved by simply selecting the relevant text.
  • Switch Access: this feature provides additional support for people with a mobility impairment by enabling a series of commands to be carried out via switch mechanisms.
  • Text-to-speech output: this allows you to adjust the screen reader’s voice, speed and language settings.

    Android Pie accessibility features screenshot 2 of 3

  • Font size: easily adjust the size of the font across the whole operating system.
  • Display size: this can scale elements in Android to make the display larger or smaller.
  • Magnification: this is a full-screen magnifier that allows you to zoom in and out of an area on the screen by triple-tapping on the screen.
  • Large mouse cursor: if you are using a mouse, you can adjust the size of the pointer.
  • Remove animations: this removes the effects such as fading in or fading out windows to make the interface easier to use.
  • Dwell timing: this feature can automate certain processes such as activating a mouse click if the mouse hovers over an area for a certain period of time.
  • Power Button Ends Call: as the name suggests, pressing the power button when a phone call is taking place will end the call so there is no need to find the equivalent option on the screen.

    Android Pie accessibility features screenshot 3 of 3

  • Auto-rotate screen: toggling this off forces the device to remain in its current portrait or landscape orientation rather than rotating automatically.
  • Touch and hold delay: prevents accidental bumping of the device by setting a certain amount of time for a touch on the screen to activate a feature.
  • Vibration: toggles vibration feedback on or off.
  • Mono audio: makes the same audio information come out of both sides of the headphones so that no information is missed if a person has a hearing impairment in one ear.
  • Captions: shows captions on the screen, when available, while a video is playing.

In addition, there are some experimental features relating to colour correction and high-contrast text, designed specifically for people with a colour vision disability.
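The mono audio option listed above is essentially a downmix: the left and right channels are combined so both ears receive the same signal. As a conceptual illustration only (this is not Android's actual implementation), the idea can be sketched in a few lines of Python:

```python
def downmix_to_mono(stereo):
    """Downmix stereo samples to mono by averaging the channels.

    `stereo` is a list of (left, right) sample pairs; the function
    returns one mono sample per pair, so sound that was panned to a
    single channel is not lost for a listener with hearing in one ear.
    """
    return [(left + right) / 2 for left, right in stereo]

# A sound present only in the left channel still reaches the mono output.
samples = [(0.8, 0.0), (0.4, 0.4), (0.0, -0.6)]
print(downmix_to_mono(samples))  # -> [0.4, 0.4, -0.3]
```

Real audio pipelines work on interleaved sample buffers rather than tuples, but the averaging step is the same idea.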

Useful TalkBack features

In addition to all the features listed, there are two other things, found only in Oreo and Pie, that are worth mentioning. Firstly, the audio for accessibility features such as TalkBack can now be adjusted separately from the media volume, which makes it much easier to control. Secondly, a phone call can be answered by swiping up the screen with two fingers instead of having to find the ‘answer’ button.

These features, when combined with Power Button Ends Call and the helpful TalkBack tips that appear from time to time, make the device much easier to use on a daily basis.

Overall, if you have an Android smartphone running Android 7.0 Nougat or earlier, I’d strongly recommend investigating whether your device can be upgraded. If your device manufacturer doesn’t offer an upgrade, it may be worth searching online to see if the community has created its own upgraded version, as I’ve done with my old smartphone, so you can get the latest accessibility features. If you are currently using Android 8.0 or 8.1 Oreo, there’s not as much on offer in Pie, but it is encouraging to see Google continuing to improve accessibility in its products.

IoT education report now available on the NCSEHE website

In 2017, I was involved in a Curtin University research project titled Internet Of Things (IoT) Education Implications for Students with Disabilities. Thanks to the support of academics across four universities, the report has now been published on the National Centre for Student Equity in Higher Education (NCSEHE) website.

The six-month research project was undertaken at Curtin University to determine the significance of the Internet of Things (IoT) in a tertiary education context. The research consisted of both an analysis of the current literature — focussing on consumer-based IoT, the IoT and disability, and the IoT and education — and interviews conducted to determine the perspectives on IoT of five students with disabilities.

The report findings indicated that:

“While the deployment of this technology in higher education, particularly in relation to students with disabilities, is still in its infancy, recent developments — such as the ubiquitous availability of smartphones, improvements in consumer-based IoT engagement such as standalone digital assistants, greater affordability, as well as the ease of collecting real-time data — provide significant opportunity for IoT innovations and solutions. The potential to seamlessly link students to their learning environment — in traditional classrooms or remotely — has great promise. In addition, students access the IoT via their own devices, thereby enabling their preferred assistive technologies (AT) and their individualised settings. Nevertheless, it is also critical that issues relating to privacy, security and interoperability are also addressed within the IoT context.

While IoT in higher education is still an emerging technology, particularly in relation to access for people with disabilities, universities need to seize the opportunities presented and develop plans to both engage with, and develop, these technologies in a learning and teaching environment. They also need to ensure that these technologies are interoperable with student’s own technology, particularly AT and to address the challenges to the privacy and security for both students and staff presented by IoT technologies.”

Specific recommendations from the report are as follows:

  • “The implementation of IoT solutions should focus on the use of personal smartphones as the primary IoT interface device for students with disabilities.
  • The IoT equipment associated with learning such as a digital whiteboard should have the ability to provide its output to students via an LMS or app. This would ensure that students with disabilities can process the data with their preferred AT.
  • The use of IoT to observe students and the lecturer to enhance the effectiveness of learning materials and facilitate the implementation of improvements.
  • All IoT-related implementations will need to consider privacy, security and interoperability as highlighted by the ongoing World Wide Web Consortium (W3C) Web of Things (WoT) research.
  • Any IoT solution must be accompanied by training to ensure that all staff and students are able to use it effectively.
  • Trial of standalone digital assistants such as Google Home and related devices such as Google Chromecast should be provided to students with disabilities to assess their long-term effectiveness in improving educational outcomes.
  • The applicability of using a digital assistant as a real-time captioning device warrants further research.
  • IoT solutions for classroom environmental controls should be explored for automatic optimisation for student learning — this could be available to students via an aggregated voting system, possibly via a smartphone app.”

This report was initially released in October 2017 to support the Web of Things work at the World Wide Web Consortium (W3C). As a result, several requests were received for the report to be added to the NCSEHE, whose purpose is to inform public policy design and implementation, and institutional practice, in order to improve higher education participation and success for marginalised and disadvantaged people. The report was added to the NCSEHE in August 2018.

Special thanks to Shadi Abou-Zahra, Professor Gerard Goggin, Professor Vanessa Chang and a number of other academics whose support led to the NCSEHE submission process. Many thanks also to my co-authors for their involvement in the report. If you would like to read the report, the full text can be downloaded from the NCSEHE website or from the W3C Web Of Things publications page.

Amazon updates Echo Show to support speech and hearing impaired users

The Amazon Echo Show, a smart speaker that includes a touchscreen, has received a new feature that makes it easier for people with a speech or hearing impairment to interact with the device.

The feature, called ‘Tap To Alexa’, lets the user interact with the Alexa digital assistant by tapping the screen instead of speaking. Once the feature is enabled, users can tap on the information they want rather than providing the equivalent verbal command.

In a recent article by Mallory Locklear at Engadget the updates were described as follows:

“The feature includes shortcuts to common Alexa items like weather, timers, news and traffic, and users can also type out Alexa commands. Additionally, while Amazon launched its Alexa captioning feature in the US a few months ago, it’s now releasing that feature to users in the UK, Germany, Japan, India, France, Canada, Australia and New Zealand.”

The following YouTube video also provides a bit more detail on how the new feature works.

The primary difference between the entry-level Amazon Echo smart speakers and the Show is the addition of a touchscreen, which provides visual information alongside audio-based feedback. This makes the device much easier for people who are Deaf or hearing impaired to use, as they can see the information. The addition of caption support also helps by mirroring verbal interactions with text on the screen.

To enable the Tap To Alexa feature, navigate to the Settings section, then Accessibility, then touch the ‘Tap To Alexa’ option. Additional information about the Echo Show can be found on the Amazon website.

ATO staff upskill in preparation for WCAG 2.1

Last week it was a great privilege to provide a webinar to staff from the Australian Taxation Office (ATO), supporting their upskilling processes around the new WCAG 2.1 standard.

The webinar included 11 staff across four states, with a focus on showing ICT professionals, internal accessibility specialists and content producers how the mobile-focused WCAG 2.1 can be integrated into their work processes.

The half-day webinar covered why digital access for people with disability is important, how to interpret the legacy WCAG 2.0 success criteria in a mobile context, an overview of the new WCAG 2.1 success criteria to Level AA, and the implications of future standards developments. The webinar also included two practical activities: using a screen reader and testing websites with an automated tool.
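The idea behind automated accessibility testing can be illustrated with nothing more than Python's standard library: scan a page's HTML for `img` elements that lack an `alt` attribute, one of the most common WCAG failures. This is a deliberately minimal sketch for illustration, not the tool used in the webinar; real checkers such as those based on the WCAG success criteria test many more rules.

```python
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Count <img> tags that are missing an alt attribute (WCAG 1.1.1)."""

    def __init__(self):
        super().__init__()
        self.total_imgs = 0
        self.missing_alt = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            self.total_imgs += 1
            # attrs is a list of (name, value) pairs for the tag.
            if "alt" not in dict(attrs):
                self.missing_alt += 1

html = '<p><img src="logo.png" alt="Company logo"><img src="chart.png"></p>'
checker = AltTextChecker()
checker.feed(html)
print(f"{checker.missing_alt} of {checker.total_imgs} images missing alt text")
# -> 1 of 2 images missing alt text
```

Automated checks like this only catch a subset of issues; an empty or unhelpful `alt` value still needs human review, which is why the webinar paired the automated activity with hands-on screen reader use.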

While many ATO staff were already actively working to make digital content accessible, the guidance provided in the webinar helped to highlight a more mobile-specific focus in preparation for the WCAG 2.1 rollout mentioned in a recent tweet by the Digital Transformation Agency (DTA).

Thanks again to the staff at the ATO for the opportunity to provide the webinar – it’s wonderful to see so many people dedicated to the pursuit of digital access. If any other government department is interested in being upskilled in WCAG 2.1, please get in touch.