Dr Scott Hollier - Digital Access Specialist Posts

Dr Scott Hollier to give public ‘seeing without light’ lecture at UWA

It’s a great privilege to let you know that on Wednesday 18 April I’ll be giving a free public lecture at the University of Western Australia titled ‘Seeing without Light: how people with disability are embracing emerging technologies’.

My presentation will discuss how the rapid evolution of computers and mobile devices has had a significant impact on how we engage online and with each other. Yet for people with disabilities, including visual impairment, such technologies represent far more than the sum of their parts – they are ultimately a gateway to independence. With emerging technologies such as virtual reality, augmented reality and the Internet of Things, how can we ensure that people with disability continue to be a part of our digital culture? I’ll demonstrate how people with disability are currently able to engage with consumer devices, along with the benefits and issues associated with our new and emerging consumer digital needs.

This talk is part of the 2018 Light Talks series, “Living with and without light.” The aim is to raise awareness about the experience of vision impaired people in a globalized and technological world. This series is presented by UWA Optical Society (OSA) student chapter and the UWA Institute of Advanced Studies.

If you live in the Perth area and would like to come along, the details are as follows:

When: Wednesday 18 April 2018

Time: 6pm-7pm

Where: Austin Lecture Theatre, UWA Arts Building

Cost: Free

RSVP: online via

Thanks very much to the organisers for the opportunity and looking forward to hopefully seeing some of you there.

Eye-D Android app now also available on iPhone

The popular Eye-D Android app, a ‘swiss-army knife’ style collection of tools designed to assist people who are blind or vision impaired, is now available for iOS devices such as the Apple iPhone.

Screenshot of Eye-D app menu screen

The Eye-D app contains a number of useful accessibility features with a particular focus on supporting people who are blind or have low vision. Features include the ‘Where am I?’ function which uses GPS to provide users with the closest street address, the ability to identify an object in an image and the ability to take a photo of text which is then read out. The app also has an ‘around me’ function which allows the user to find key features in their surroundings such as food places, bus stops, banks, and cinemas, then provides the option to send the information to Google Maps so the user can navigate to that location.

While the app has been available on Android devices for several years, the app has only just launched for the Apple iPhone. The standard Eye-D app is free and a Pro version is available for purchase. A second collection of features available for purchase includes an advanced OCR scanner for different languages and a colour identifier.

The arrival of this app and the similar Seeing AI app by Microsoft on the iPhone marks a recent trend towards bundling multiple features into one app. Further information on the Eye-D app can be found in the Apple App Store and Google Play Store.

ACT TF & YouDescribe – two highlights from CSUN 2018

It is with a bittersweet feeling that I write this article about CSUN 2018. Originally I was planning to be there to present on my Internet of Things research provided to W3C. Sadly it wasn’t meant to be: a medical issue flared up, leading me to cancel my travel just after check-in. It’s a strange feeling to ask the airline to retrieve your luggage after it has already gone through!

While I didn’t make it to the conference myself, there were lots of amazing presentations, and after touching base with several people who did present I wanted to pull together two things that really stood out for me from the conference.

Accessibility Conformance Test (ACT) Task Force leadership

Shadi Abou-Zahra from W3C gave a great presentation about the work of the Accessibility Conformance Test (ACT) Task Force, which is focused on supporting test tool developers, test professionals, industry, manufacturers of technology, and procurers of accessible technology, among others, in the testing of accessibility. The goals of the ACT TF are to:

  • Reduce differing interpretations of WCAG
  • Make test procedures interchangeable
  • Develop a library of commonly accepted rules for WCAG
  • Establish a community of contributors

While WCAG-EM 1.0 is already in place to help people create an accessibility auditing process, its guidance is quite broad. The upcoming standards work on WCAG 2.1 and Silver highlights the need to bring uniform and consistent test procedures to accessibility auditing.

I’ve personally seen the auditing processes of four companies and while they all provide effective guidance, they’re also all significantly different – including my own approach. The work of the ACT TF will be an exciting development going forward and will really help bring people who work in the industry together around a consistent interpretation of the relevant standards.


YouDescribe

A second presentation I was sad to miss, but have since caught up on, is about YouDescribe, a way to create audio described videos on YouTube for free. The website explains how it works like this:

“Sighted people view YouTube videos and record descriptions of what they see. When the video is played with YouDescribe, the descriptions are played back with the video. Underneath the hood, YouDescribe uses an exclusive API to store description clips and information about them. YouDescribe knows what video each clip belongs to and what time the clip should be played. Lots of other information is stored along with the descriptions, including who recorded it, when it was recorded, how popular it is, etc. YouDescribe is the first video service to allow anybody, anywhere, to record and upload video descriptions to the cloud. It provides a unique way for people to get descriptions for the instructional, informational, and entertainment videos offered on YouTube.”


Audio described video is of great benefit to people who are blind or vision impaired, yet its implementation here in Australia is notably lacking. YouDescribe potentially makes audio described content much easier to create and easier to find. While YouDescribe is not a brand-new service, I’d found it hard to track down concrete information about it until the CSUN presentation, which is one of the things that makes such conferences great.

Internet of Things report now available  

Also, while talking about CSUN 2018, I’d like to take this opportunity to sincerely apologise to anyone who was trying to find me or turned up to my presentation slot due to the late cancellation. If you would like to read the full report of my Internet of Things research that was prepared for W3C, you can find a link to it in the Publications list of the W3C Web of Things wiki.

Google Lens receives Assistant support and wider release

Last year, Google announced the launch of its new Lens feature, designed to not only provide information about an image, but connect it to real-world information. The exciting news is that the feature, initially limited to Google’s own Pixel smartphones, is now being rolled out to most Android users in an update to the Photos app. iPhone users will also receive Lens at a later date.

At the time I mentioned that for people who are blind or vision impaired, the Lens feature has the potential to provide significant benefits. While there are several effective apps available on mobile devices that can deliver image recognition and OCR capabilities, Lens has the additional benefit of connecting the image with meaningful data that is likely to be useful while the user is in that specific location. For example, a blind user could take a photo of a café and not only have the café itself identified, but a menu could be provided at the same time along with information about the accessibility of the building.

In addition, the feature is being added to the Google Assistant. According to an article by Android Police, “Lens in Google Photos will soon be available to all English-language users on both Android and iOS. You’ll be able to scan your photos for landmarks and objects, no matter what platform you use. In addition, Lens in Assistant will start rolling out to “compatible flagship devices” over the coming weeks. The company says it will add support for more devices as time goes on.”

With the Google Assistant also receiving Lens, people with vision-related disabilities will be able to simply speak to their phone to identify their surroundings. Additional information on the feature can be found at Google’s Lens information page.

WCAG 2.1 Candidate Recommendation – what it means for Level AA compliance

In January 2018, the hard work of W3C resulted in its new Web Content Accessibility Guidelines (WCAG) 2.1 standard reaching Candidate Recommendation stage. This means that, barring any significant issues in its real-world testing, the standard is close to completion and on track for its mid-2018 release date.

Last year I wrote an article titled WCAG 2.1 draft: reflections on the new guidelines and success criteria.  It’s fair to say that a lot has changed since then and that is ultimately a good thing. Importantly, the purpose of WCAG 2.1 is NOT to replace the well-established WCAG 2.0 standard, but rather to extend support for the mobile web. In my view, WCAG 2.1 as it stands does this very well with some additional success criteria added to existing guidelines, and a much-needed focus on supporting mobile platforms by providing two new guidelines.

With WCAG 2.1 reaching a near-complete stage, it’s a great time to look at all this in a bit more detail and focus on what additional work will be required by ICT professionals to meet WCAG 2.1 Level AA compliance.

Before we get started, it’s important to highlight two things: firstly, I’m not going to go through all the guidelines and success criteria associated with WCAG 2.0, just the WCAG 2.1 extensions.  If you’d like more information on the original WCAG 2.0, I’d be happy to provide you with some resources to get up-to-date with the international standard.

Secondly, I’m going to focus particularly on Level AA compliance. A number of the new WCAG 2.1 success criteria are Level AAA so for now I’ll leave those out given most policy and legislative frameworks don’t go to Level AAA.

Two new guidelines

So with those things in mind, let’s start with the two new guidelines. The addition of the guidelines brings the total in WCAG 2.1 to 14 with both guidelines sitting under the Operable design principle. The new guidelines are:

  • 2.5 Pointer Accessible: Make it easier for users to operate pointer functionality.
  • 2.6 Additional sensor inputs: Ensure that device sensor inputs are not a barrier for users.

Guideline 2.5 is focused on making sure that all functions work correctly whatever type of pointer is being used, such as a mouse pointer, a finger interacting with a touch screen, an electronic pencil/stylus or a laser pointer. It may be the case that a person with a disability finds it easier to use one type of pointing device over another, so from an accessibility standpoint all options should be available.

Guideline 2.6 relates to the use of sensor inputs such as a gyroscope or accelerometer found on mobile devices. From an accessibility standpoint, it may be the case that users can’t operate these sensors – for example, being unable to tilt a mobile phone that is mounted on a wheelchair. As such, it is essential that people with disabilities are able to achieve the same outcome through the user interface if they are unable to use a sensor-based input method.

In addition to the two new guidelines, there are also new success criteria that have been added to existing guidelines. The following is a list of all Level A and AA WCAG 2.1 success criteria extensions.

WCAG 2.1 Level A

Success Criterion 2.4.11: Character Key Shortcuts

If a keyboard shortcut is implemented in content using only letter (including upper- and lower-case letters), punctuation, number, or symbol characters, then at least one of the following is true:

  • Turn off
  • Remap
  • Active only on focus
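As a rough sketch of how those three options might play out in practice (the interface and function names below are my own, not from the standard), a conforming handler would only fire a single-character shortcut when a modifier key is held or the target component has focus, with a user setting to turn shortcuts off entirely:

```typescript
// Hypothetical helper sketching SC 2.4.11: a printable-character shortcut
// should be able to be turned off, remapped to include a modifier, or
// restricted to when the relevant component has focus.
interface ShortcutEvent {
  key: string;              // e.g. "m"
  ctrlKey: boolean;
  altKey: boolean;
  targetHasFocus: boolean;  // true when the relevant component is focused
}

interface ShortcutSettings {
  enabled: boolean;         // the "turn off" option
  requireModifier: boolean; // the "remap" option: require Ctrl/Alt + key
}

// Decide whether a single-character shortcut is allowed to activate.
function shouldActivateShortcut(ev: ShortcutEvent, s: ShortcutSettings): boolean {
  if (!s.enabled) return false;        // user has turned shortcuts off
  if (ev.targetHasFocus) return true;  // "active only on focus"
  if (s.requireModifier) return ev.ctrlKey || ev.altKey; // remapped combo
  // Bare printable-key shortcuts risk clashing with speech input, so the
  // safe default is not to fire them outside the component's focus.
  return false;
}
```

The point of the decision function is that a speech-input user dictating text never triggers the shortcut accidentally, because a bare keypress only counts when the component itself is focused.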

Success Criterion 2.4.12: Label in Name

For user interface components with labels that include text or images of text, the name contains the text presented visually.
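In practice this means the accessible name should contain the visible label text, so that speech-input users can activate a control by saying what they see. A minimal check, using hypothetical names of my own, might look like:

```typescript
// Sketch of SC 2.4.12 (Label in Name): does the accessible name contain the
// text presented visually? Whitespace and case are normalised before the
// comparison, since speech input is not case-sensitive.
function labelInName(visibleLabel: string, accessibleName: string): boolean {
  const norm = (s: string) => s.trim().toLowerCase().replace(/\s+/g, " ");
  return norm(accessibleName).includes(norm(visibleLabel));
}
```

For example, a button visibly labelled “Search” with an accessible name of “Search this site” passes, while a button labelled “Send” whose aria-label is “Submit form” fails, because a speech-input user saying “click Send” has nothing to match.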

Success Criterion 2.5.1: Pointer Gestures

All functionality that uses multipoint or path-based gestures for operation can be operated with a single pointer without a path-based gesture, unless a multipoint or path-based gesture is essential.
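A common example is pinch-to-zoom. One way to conform, sketched below with illustrative names of my own (not from WCAG), is to have the multipoint gesture and simple single-pointer “+”/“−” buttons drive the same zoom state:

```typescript
// Sketch of SC 2.5.1: a pinch gesture and step buttons both operate the same
// zoom function, so the multipoint gesture is never the only way to zoom.
class ZoomState {
  private level = 1;

  // Single-pointer alternatives: step buttons, clamped to the same range.
  zoomIn(): number { return (this.level = Math.min(this.level + 0.25, 4)); }
  zoomOut(): number { return (this.level = Math.max(this.level - 0.25, 0.25)); }

  // Multipoint path: a pinch gesture reports a scale factor.
  pinch(scale: number): number {
    this.level = Math.min(Math.max(this.level * scale, 0.25), 4);
    return this.level;
  }

  get current(): number { return this.level; }
}
```

Because both input paths update one shared state, a user who cannot perform a two-finger pinch loses nothing by using the buttons instead.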

Success Criterion 2.5.2: Pointer Cancellation

For functionality that can be operated using a single pointer, at least one of the following is true:

  • No Down-Event
  • Abort or Undo
  • Up Reversal
  • Essential
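The ‘no down-event’ and ‘abort or undo’ options above can be sketched as a single decision (function name is my own, for illustration only): activation completes on the up-event, and releasing the pointer away from the control aborts it.

```typescript
// Sketch of SC 2.5.2: completing activation on the up-event gives the user a
// built-in way to cancel - press down on a control, slide off, and release.
function resolvePointerAction(
  downOnTarget: boolean,
  upOnTarget: boolean
): "activate" | "abort" {
  // Nothing irreversible happens on the down-event; the action only fires
  // when the pointer is released over the same control it pressed.
  return downOnTarget && upOnTarget ? "activate" : "abort";
}
```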

Success Criterion 2.6.1: Motion Actuation

Functionality that can be operated by device motion or user motion can also be operated by user interface components and responding to the motion can be disabled to prevent accidental actuation, except when:

  • Supported Interface
  • Essential
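To make the 2.6.1 requirement concrete, here is a hedged sketch (the shake-to-undo scenario and the function name are my own illustrations): a motion trigger and an ordinary button reach the same action, and motion handling can be disabled so a mounted or hand-tremor-affected device doesn’t fire it accidentally.

```typescript
// Sketch of SC 2.6.1: "shake to undo" is mirrored by a UI button, and the
// motion path can be switched off to prevent accidental actuation.
type UndoTrigger = "shake" | "button";

function handleUndoRequest(trigger: UndoTrigger, motionEnabled: boolean): boolean {
  // Motion input only works when the user has left it enabled...
  if (trigger === "shake" && !motionEnabled) return false;
  // ...but the on-screen button always works, so disabling motion loses nothing.
  return true;
}
```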

These essential WCAG 2.1 success criteria demonstrate the importance of catering for people with disabilities on the mobile platform. Success criteria 2.4.11 and 2.4.12 help users navigate and find content by ensuring that customised keyboard shortcuts don’t interfere with assistive technology and that visible labels match their programmatic names. Success criteria 2.5.1 and 2.5.2 address the importance of making sure that multi-gesture commands can be achieved using a single pointer and that there is a consistent technique to cancel an action. The final success criterion in this list, 2.6.1, requires that commands reliant on the movement of a device also be achievable through the standard user interface.

One of the concerns raised through the WCAG 2.1 process is that several of these success criteria include an ‘essential’ exception, which many view as a way for developers to avoid implementing accessibility by arguing that the design is essential to the functionality of the web content. However, I suspect that in practical terms it will be difficult for a developer to argue that other accommodations are not possible when alternative interface controls could obviously have addressed the issue. It will be interesting to see under what circumstances the ‘essential’ argument is put forward as a way of avoiding some WCAG 2.1 success criteria.

WCAG 2.1 Level AA

Success Criterion 1.3.4: Identify Common Purpose

The meaning of each input field collecting information about the user can be programmatically determined when:

  • The input field has a meaning that maps to the HTML 5.2 Autofill field names; and
  • The content is implemented using technologies with support for identifying the expected meaning for form input data.
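In HTML this success criterion is typically met with the autocomplete attribute, whose tokens come from the HTML 5.2 autofill field names. As a hypothetical sketch (the helper function and the small token map are my own, and cover only a handful of the defined names):

```typescript
// Sketch of SC 1.3.4: exposing each field's purpose programmatically via the
// HTML autofill field names. Only a few illustrative tokens are listed here;
// the full set is defined in the HTML specification.
const AUTOFILL_TOKENS = {
  fullName: "name",
  email: "email",
  phone: "tel",
  postcode: "postal-code",
} as const;

// Hypothetical helper that emits a labelled input's markup as a string.
function buildInput(id: string, purpose: keyof typeof AUTOFILL_TOKENS): string {
  return `<input id="${id}" autocomplete="${AUTOFILL_TOKENS[purpose]}">`;
}
```

Because the purpose is machine-readable, browsers can autofill the field and assistive technologies can render it with, say, a familiar icon, regardless of what the visible label text says.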

Success Criterion 1.4.10: Reflow

Content can be presented without loss of information or functionality, and without requiring scrolling in two dimensions for:

  • Vertical scrolling content at a width equivalent to 320 CSS pixels;
  • Horizontal scrolling content at a height equivalent to 256 CSS pixels;

Except for parts of the content which require two-dimensional layout for usage or meaning.

Success Criterion 1.4.11: Non-Text Contrast

The visual presentation of the following have a contrast ratio of at least 3:1 against adjacent color(s):

  • User Interface Components
  • Visual information used to indicate states and boundaries of user interface components, except for inactive components or where the appearance of the component is determined by the user agent and not modified by the author;
  • Graphical Objects
  • Parts of graphics required to understand the content, except when a particular presentation of graphics is essential to the information being conveyed.
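The 3:1 threshold uses the same contrast-ratio formula WCAG 2.x defines for text, based on relative luminance. A self-contained sketch of that calculation (the function names are mine; the formula and constants are from the WCAG definition):

```typescript
// WCAG contrast ratio, applied to the SC 1.4.11 minimum of 3:1 for UI
// components and graphical objects.
type RGB = [number, number, number]; // sRGB channels, 0-255

// Relative luminance per the WCAG definition.
function relativeLuminance([r, g, b]: RGB): number {
  const lin = (c: number) => {
    const s = c / 255;
    return s <= 0.03928 ? s / 12.92 : Math.pow((s + 0.055) / 1.055, 2.4);
  };
  return 0.2126 * lin(r) + 0.7152 * lin(g) + 0.0722 * lin(b);
}

// Contrast ratio = (L_lighter + 0.05) / (L_darker + 0.05), ranging 1 to 21.
function contrastRatio(a: RGB, b: RGB): number {
  const [hi, lo] = [relativeLuminance(a), relativeLuminance(b)].sort((x, y) => y - x);
  return (hi + 0.05) / (lo + 0.05);
}

function meetsNonTextContrast(a: RGB, b: RGB): boolean {
  return contrastRatio(a, b) >= 3; // SC 1.4.11 minimum
}
```

For instance, a mid-grey focus outline of rgb(170, 170, 170) on a white background falls short of 3:1, while darker greys pass, which is exactly the kind of borderline case this success criterion is meant to catch.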

Success Criterion 1.4.12: Text Spacing

In content implemented using markup languages that support the following text style properties, no loss of content or functionality occurs by setting all of the following and by changing no other style property:

  • Line height (line spacing) to at least 1.5 times the font size;
  • Spacing following paragraphs to at least 2 times the font size;
  • Letter spacing (tracking) to at least 0.12 times the font size;
  • Word spacing to at least 0.16 times the font size.

Exception: Human languages and scripts which do not make use of one or more of these text style properties in written text can conform using only the properties that are used.
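A practical way to test this success criterion is to inject a stylesheet that forces the specified spacing values and then check that no content is clipped or lost. The following is a hypothetical bookmarklet-style helper of my own that generates such an override:

```typescript
// Sketch of an SC 1.4.12 test override: apply the four spacing values from
// the success criterion and observe whether content or functionality breaks.
// Em units scale with font size, matching the "times the font size" wording.
function textSpacingCSS(): string {
  return [
    "* {",
    "  line-height: 1.5 !important;",       // 1.5 x font size
    "  letter-spacing: 0.12em !important;", // 0.12 x font size
    "  word-spacing: 0.16em !important;",   // 0.16 x font size
    "}",
    "p { margin-bottom: 2em !important; }", // paragraph spacing, 2 x font size
  ].join("\n");
}
```

In a browser this string would be dropped into a style element on the page under test; a page conforms if everything remains readable and operable with the override active.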

Success Criterion 1.4.13: Content on Hover or Focus

Where receiving and removing pointer hover or keyboard focus triggers additional content to become visible and hidden, respectively, the following are true:

  • Dismissable
  • Hoverable
  • Persistent

Exception: The visual presentation of the additional content is controlled by the user agent and is not modified by the author.

Success Criterion 2.6.2: Orientation

Content does not restrict its view and operation to a single display orientation, such as portrait or landscape, unless a specific display orientation is essential.

While many of the success criteria here may initially appear difficult to implement, the bundling of the 1.4.x success criteria together makes it easier to see that they are all related to visual improvements. I’m particularly pleased to see that the infamous ‘double scrollbar’ scenario that screen magnification users face so often, particularly on small displays, is addressed to some degree here, along with contrast, text spacing and hover issues. The level of detail in some of these success criteria is remarkably specific, and it will be interesting to see whether the real-world testing confirms these values prior to the final WCAG 2.1 release.

I’m also pleased to see Orientation make it this far, and out of all the new success criteria I suspect this will be the one that resolves the most frustration for people with disabilities on a mobile device. When a mobile website or app forces the user into portrait or landscape for no apparent reason it is a great frustration, and it significantly affects how the device can be used: it can change the reach of buttons, make swipe gestures more awkward or affect the use of screen real estate. While there is another ‘essential’ exception creeping in here, in my experience few apps that lock orientation really need to do so, which leaves me optimistic about the implementation of this one.

Where to from here?

From here, it’s likely that WCAG 2.1 will be released this year as promised, and the take-away message for ICT professionals is to get ready to consider the access implications of mobile-specific elements such as multiple input methods and sensor integration. For accessibility professionals, the biggest message is that testing the accessibility of web content on a mobile device is no longer optional, and new testing processes will need to be established.

That said, it’s been exciting through my W3C work to see the standard being developed and I’d like to say a big thank you to all the W3C staff, invited experts and volunteers who continue to give up their time to ensure that the mobile web receives a much-needed accessibility focus through the development of WCAG 2.1.

NOTE: at this time of writing, WCAG 2.1 is a Candidate Recommendation and is NOT an official W3C standard. As such, the guidelines and success criteria discussed in this article may differ from the final W3C recommendation.