

Amazon updates Echo Show to support speech and hearing impaired users

The Amazon Echo Show, a digital smart speaker that includes a touchscreen, has received a new feature to make it easier for people who have a speech or hearing impairment to interact with the device.

The feature, called ‘Tap To Alexa’, allows the user to tap the screen to interact with the Alexa digital assistant instead of using verbal commands. Once the feature is enabled, users can tap on the information they want rather than speaking the equivalent command.

In a recent article by Mallory Locklear at Engadget, the updates were described as follows:

“The feature includes shortcuts to common Alexa items like weather, timers, news and traffic, and users can also type out Alexa commands. Additionally, while Amazon launched its Alexa captioning feature in the US a few months ago, it’s now releasing that feature to users in the UK, Germany, Japan, India, France, Canada, Australia and New Zealand.”


The primary difference between the entry-level Amazon Echo smart speakers and the Show is the addition of a touchscreen, which provides visual information alongside audio-based feedback. This makes the device much easier to use for people who are Deaf or hearing impaired, as the information is presented on screen. The addition of caption support also helps by mirroring verbal interactions with text on the screen.

To enable the Tap To Alexa feature, users need to open the Settings section, select Accessibility, then tap the ‘Tap To Alexa’ option. Additional information about the Echo Show can be found on the Amazon website.

ATO staff upskill in preparation for WCAG 2.1

Last week it was a great privilege to provide a webinar to staff from the Australian Taxation Office (ATO), supporting their upskilling around the new WCAG 2.1 standard.

The webinar included 11 staff across four states and focused on supporting ICT professionals, internal accessibility specialists and content producers in integrating the mobile-focused WCAG 2.1 into their work processes.

The half-day webinar covered why digital access for people with disability is important, how to interpret the legacy WCAG 2.0 success criteria in a mobile context, an overview of the new WCAG 2.1 success criteria to Level AA, and the implications of future standards developments. The webinar also included two practical activities: using a screen reader and testing websites with an automated tool.
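The webinar did not single out a particular tool, but as one illustration, the open-source axe engine can drive this kind of automated check from Python via the axe-selenium-python package. The sketch below is purely illustrative: it assumes Firefox with geckodriver installed, and https://example.com stands in for the site being tested.

    # A minimal sketch of automated accessibility testing using the
    # open-source axe engine via the axe-selenium-python package.
    # Assumptions: pip install selenium axe-selenium-python, with Firefox
    # and geckodriver available on the PATH; the URL is a placeholder.
    from selenium import webdriver
    from axe_selenium_python import Axe

    driver = webdriver.Firefox()
    try:
        driver.get("https://example.com")  # placeholder URL
        axe = Axe(driver)
        axe.inject()          # inject the axe-core JavaScript into the page
        results = axe.run()   # run the accessibility rule checks
        axe.write_results(results, "a11y_results.json")
        # Each entry in results["violations"] describes a failed rule,
        # the affected elements and links to remediation guidance.
        print(f"{len(results['violations'])} accessibility violations found")
    finally:
        driver.quit()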

While many ATO staff were already actively working to make digital content accessible, the webinar helped to sharpen a mobile-specific focus in preparation for the WCAG 2.1 rollout mentioned in a recent tweet by the Digital Transformation Agency (DTA).

Thanks again to the staff at the ATO for the opportunity to provide the webinar – it’s wonderful to see so many people dedicated to the pursuit of digital access. If any other government department is interested in being upskilled in WCAG 2.1, please get in touch.

Australian Digital Transformation Agency commits to WCAG 2.1 AA update

The Digital Transformation Agency (DTA), a Federal government department that oversees the national Digital Service Standard (DSS), issued a tweet recently confirming it will be moving to WCAG 2.1.

In its tweet, the DTA stated that:

“There will be 17 new criteria in #WCAG 2.1 to cover new digital tools and understanding mobile, low vision and cognitive. @DTA will update our guides to meet WCAG 2.1 criteria. #contentstrategy #accessibility #dtachats”

While the tweet is good news, indicating that the Australian Federal government will be moving to WCAG 2.1, the announcement initially caused some confusion. WCAG 2.1 adds 17 new Success Criteria in total, but five of these sit at Level AAA, leaving only 12 at Levels A and AA. By referring to all 17 criteria, the tweet implied that the Federal government would be moving to Level AAA, which, if true, would have been a major departure from existing government policy.

To clarify this issue, I sent a reply to the DTA’s tweet, checking if they were indeed planning to implement all 17 Success Criteria. In response, the DTA stated:

“Hi @scotthollier, we apologise for the delay in responding. There are no plans to move to AAA, but as we explore the updates to WCAG 2.1 we’re issuing new best practice advice to help agencies understand the changes.”

This suggests that the new Federal government requirement described by the Digital Service Standard is likely to be based on WCAG 2.1 Level AA conformance, incorporating the 12 largely mobile-focused new Success Criteria.
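For teams planning the transition, the split between the 17 new criteria and the 12 that apply at Level AA conformance can be made concrete with a simple checklist. The Python sketch below is illustrative only; it enumerates the 12 new Level A and AA Success Criteria (the remaining five sit at Level AAA).

    # Illustrative checklist of the 12 new WCAG 2.1 Success Criteria
    # at Levels A and AA; the other 5 new criteria are Level AAA.
    new_wcag21_criteria = {
        "A": {
            "2.1.4": "Character Key Shortcuts",
            "2.5.1": "Pointer Gestures",
            "2.5.2": "Pointer Cancellation",
            "2.5.3": "Label in Name",
            "2.5.4": "Motion Actuation",
        },
        "AA": {
            "1.3.4": "Orientation",
            "1.3.5": "Identify Input Purpose",
            "1.4.10": "Reflow",
            "1.4.11": "Non-text Contrast",
            "1.4.12": "Text Spacing",
            "1.4.13": "Content on Hover or Focus",
            "4.1.3": "Status Messages",
        },
    }

    for level, criteria in new_wcag21_criteria.items():
        print(f"Level {level}: {len(criteria)} new criteria")
        for number, name in criteria.items():
            print(f"  {number} {name}")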

If you wish to upskill your staff in preparation for the new DTA requirement, you can find additional information in the Consultancy and Training section of this website. There is also a free WCAG 2.1 resource at the Centre For Accessibility website.

W3C WAI CAPTCHA Note update open for public comment

It’s with much excitement that I can share some W3C work I have been involved in relating to the update of the W3C WAI advice on the inaccessibility of CAPTCHA. The Note has now reached the public review stage and feedback is welcome.

In an e-mail to the WAI Interest Group (WAI-IG), fellow editors Janina Sajka (Accessible Platform Architectures WG Chair) and Michael Cooper (Accessibility Guidelines WG W3C Staff Contact) announced that:

“The Accessible Platform Architectures Working Group has published a Working Draft of a revision to Inaccessibility of CAPTCHA at: https://www.w3.org/TR/turingtest/ Inaccessibility of CAPTCHA has been a Working Group Note since 2005. It describes problems with common approaches to distinguish human users of web sites from robots, and examines a number of potential solutions. Since the last publication, the abilities of robots to defeat CAPTCHAs has increased, and new technologies to authenticate human users have come available. This update brings the document up to date with these new realities. It is published as Working Draft to gather public review, after which it is expected to be republished as a Working Group Note.”

As an invited expert for the W3C WAI APA Research Questions Task Force (RQTF), it’s been a privilege to work with Janina and Michael on updating the note alongside the hard work of all the RQTF members. This is the first time I’ve been involved in the W3C editorial process and the experience has been very rewarding.

With the first draft complete, the Note is open for public comment so that additional improvements to the advice can be included. Additional information about this publication can be found in the CAPTCHA Note blog post.

Google Podcasts app to introduce auto-transcription feature

Google has recently launched its new Podcasts app for Android devices with a commitment to include an auto-transcription service in the near future.

The Google Podcasts app is designed to make it easier for Android users to search for and subscribe to podcasts, functionality that has worked well on Apple iOS devices but has largely eluded Android users since Google abandoned its Listen experiment in 2012.

While the Android platform can provide podcasts via third-party apps along with some podcast features in other Google apps, the new standalone Podcasts app aims to provide a simpler experience with a promise to introduce an important accessibility feature – the ability to have podcasts auto-transcribed.

In an article written by Nicholas Quah for Hot Pod titled Could Google’s new podcast app change the way we understand the Average Podcast Listener?, the potential benefits of the app are listed as:

  • Greatly decreasing the friction from search results to an actual mobile listening experience, thus operationalizing searches as a true top of the funnel;
  • AI-assisted features like quick transcription, greater in-episode searchability, automatic visual subtitling across multiple languages, and content-indexing, which will presumably give audiences more control over the judgment and navigating of a listening experience (and, also presumably, put some speech-to-text transcription companies out of business);
  • Cross-device syncing, which allows users to easily transition between listening on a smartphone or through a smart speaker;
  • Direct monetization features, like the possibility of a “donate” button.

For people who are Deaf or hearing impaired, the potential inclusion of an auto-transcription feature into the Podcasts app would be highly beneficial in providing access to a wealth of audio-based online content. While similar Google initiatives such as auto-captioning on YouTube have been met with a mixed reception due to quality issues, the professional audio quality of most podcasts is likely to make the auto-transcription services more useful and accurate.
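Google has not said how the Podcasts transcription will be implemented, but as a rough sketch of the underlying speech-to-text idea, the Python example below uses the open-source SpeechRecognition package, which can call Google’s free web speech API. The clip file name is a placeholder, and this is not the Podcasts app’s actual pipeline.

    # A rough sketch of speech-to-text transcription, illustrating the
    # general idea behind podcast auto-transcription. Not Google's actual
    # Podcasts implementation. Assumes: pip install SpeechRecognition,
    # plus a short WAV clip; "podcast_clip.wav" is a placeholder name.
    import speech_recognition as sr

    recognizer = sr.Recognizer()
    with sr.AudioFile("podcast_clip.wav") as source:
        audio = recognizer.record(source)  # read the whole clip into memory

    try:
        # recognize_google() sends the audio to Google's free web speech API.
        transcript = recognizer.recognize_google(audio)
        print(transcript)
    except sr.UnknownValueError:
        print("Speech could not be understood")
    except sr.RequestError as err:
        print(f"Could not reach the speech service: {err}")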

The Google Podcasts app can be downloaded now from the Google Play Store. There are currently no plans for an Apple iOS release.