On Global Accessibility Awareness Day 2019, Dr Scott Hollier, co-founder of the Centre for Accessibility initiative, launched Australia’s first dedicated Access Awards at an event hosted by VisAbility.
A core reason for the Awards is that the accessibility of websites and apps is not always easy to identify visually, yet it has a significant impact on the independence of people with disability. The Australian Access Awards celebrate the organisations, service providers and designers/developers that make the effort to support people with disability but have to date received little recognition for that work.
This Centre for Accessibility initiative is a chance for everyone in Australia to acknowledge best practice, celebrate a job well done and encourage organisations that may not have a good understanding of digital access to step forward and have a go at making their content accessible.
Entries are now open!
Anyone can nominate a website or app for an Award in the appropriate category. Nomination is free, and we invite organisations to nominate themselves for an Award. We also encourage anyone within the disability community to make a nomination based on their own personal experiences.
Centre for Accessibility founders, DADAA, Media On Mars and Dr Scott Hollier, would like to thank sponsors VisAbility, Web Key IT, OZeWAI, ACCAN, the Centre for Inclusive Design and the Attitude Foundation for their support of the Awards.
Google announced at its 2019 I/O developer conference that the upcoming version of Android, currently codenamed ‘Android Q’, will feature some significant accessibility improvements relating to the automated captioning of video and the addition of search to Google Lens.
The Live Caption feature will allow users to download a video to their device and play it back with captions, regardless of whether the video was formally captioned or not. This makes use of technology similar to YouTube’s automated captioning service, whereby Google scans a video and adds captions automatically. The main difference here is that the ability to caption a video is built into Android Q, and the process appears to be near-instantaneous once a video is downloaded to a device.
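Formally captioned video usually ships with a timed caption file, such as WebVTT, prepared in advance; automated services like Live Caption effectively generate the same kind of timed text on the fly. As a rough illustration only (the segment data and function names below are hypothetical, and this is not Google’s implementation), here is a minimal Python sketch of turning timed transcript segments into a WebVTT caption file:

```python
def to_timestamp(seconds):
    """Format a time in seconds as a WebVTT timestamp (HH:MM:SS.mmm)."""
    ms = int(round(seconds * 1000))
    hours, rem = divmod(ms, 3_600_000)
    minutes, rem = divmod(rem, 60_000)
    secs, ms = divmod(rem, 1000)
    return f"{hours:02d}:{minutes:02d}:{secs:02d}.{ms:03d}"

def build_webvtt(segments):
    """Build WebVTT caption text from (start, end, text) tuples."""
    lines = ["WEBVTT", ""]
    for start, end, text in segments:
        lines.append(f"{to_timestamp(start)} --> {to_timestamp(end)}")
        lines.append(text)
        lines.append("")  # blank line separates cues
    return "\n".join(lines)

# Hypothetical recognised-speech segments: (start seconds, end seconds, text)
segments = [
    (0.0, 2.5, "Welcome to the demo."),
    (2.5, 5.0, "Captions make video accessible."),
]
print(build_webvtt(segments))
```

The point of the sketch is simply that once speech recognition has produced timed text, packaging it as captions is straightforward; the hard part, and the source of the accuracy concerns discussed below, is the recognition itself.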
While the Live Captioning feature is focused primarily on pre-recorded videos, it has also been demonstrated with real-time video calls. This has the potential to improve the communication options for people who are Deaf or hearing impaired worldwide. The following YouTube video showcases the feature in action.
While the promise of every video featuring captions, and even live captioned calls, is extremely exciting for people who are Deaf or hearing impaired, there are few options to test the feature at the time of writing outside of specific beta testing programmes. There is also some scepticism about its accuracy, given that the effectiveness of YouTube’s automated captions relies heavily on broadcast-level audio quality and only caters for a limited number of languages.
While the feature is primarily focused on people with a hearing disability, it is likely to have wider benefits for people wanting to watch video content in noisy environments, such as on a bus or plane.
In terms of availability, people using Pixel devices and recent devices affiliated with the Android One programme are likely to receive the update before the end of the year. Once a device is updated to Android Q, the feature can be enabled in the device settings.
Another significant improvement is an update to Google Lens. Google has incorporated search and some additional real-time functionality to help people interact with their environment by taking a photo.
According to Natt Garun from The Verge, “Google says Lens can search for exact dishes on a menu and surface photos of that dish based on Google Maps information to show you just how it looks before you order. You can also point the camera at the receipt to bring up a calculator that lets you add a tip then split the bill or at a sign in a foreign language to hear a text-to-speech translation.”
Lens remains a popular feature in Android for people who are blind or vision impaired, as it allows a person with a vision disability to take a photo and find out what is in the surrounding environment. The added functionality is likely to make Google Lens even more useful.
Accessibility features aside, the one remaining mystery about Android Q is its name. Google traditionally names its Android releases after sweet treats in alphabetical order, but as there aren’t many desserts that start with ‘Q’, it will be interesting to see what choice Google makes.
The World Wide Web Consortium (W3C) Web Accessibility Initiative (WAI) has significantly improved its content, making it much easier to locate documents such as the Web Content Accessibility Guidelines (WCAG) and associated resources in a variety of languages.
The All Translations section on the W3C website provides expandable menus organised by language; selecting a particular language expands it to show all current W3C WAI translations in that language.
The resource currently features categories for a range of languages.
In addition, W3C WAI appears to be relaxing its strict processes relating to document translations and is seeking volunteers to get involved. This will hopefully increase the number of documents supported in other languages.
While at the time of writing some language categories, such as Arabic, are placeholders, it is encouraging to see more effort being put into an area that has traditionally been a weak point: the provision of accessibility standards in different languages.
“The Accessible Platform Architectures Working Group has published a Working Draft of a revision to Inaccessibility of CAPTCHA at: https://www.w3.org/TR/turingtest/ Inaccessibility of CAPTCHA has been a Working Group Note since 2005. It describes problems with common approaches to distinguish human users of web sites from robots, and examines a number of potential solutions. Since the last publication, the abilities of robots to defeat CAPTCHAs has increased, and new technologies to authenticate human users have come available. This update brings the document up to date with these new realities. It is published as Working Draft to gather public review, after which it is expected to be republished as a Working Group Note.”
The latest version of the draft includes a general restructure of the Note, new guidance relating to Google reCAPTCHA, and discussion of the increased use of data collected over time to determine the likelihood of a user being a robot or a human.
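One accessible alternative often discussed in this space is the ‘honeypot’ technique: a form field hidden from all users (including screen reader users), which humans therefore leave blank but naive bots tend to fill in. As a rough sketch only (the field name and function are hypothetical examples, and the Note covers many more approaches), the server-side check amounts to:

```python
def looks_like_bot(form_data, honeypot_field="website"):
    """Flag a form submission as automated if the hidden honeypot field
    was filled in. Human users never see the field (it is hidden via CSS
    and marked aria-hidden), so a non-empty value suggests a bot.
    The field name 'website' is a hypothetical example."""
    return bool(form_data.get(honeypot_field, "").strip())

# A human submission leaves the hidden field empty;
# a naive bot fills in every field it finds.
human = {"name": "Ada", "comment": "Great post!", "website": ""}
bot = {"name": "x", "comment": "spam", "website": "http://spam.example"}
```

Unlike an image or audio CAPTCHA, this places no burden on the user at all, though it only catches unsophisticated bots, which is why the Note discusses layered approaches.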
As an invited expert for the W3C WAI APA Research Questions Task Force (RQTF), it’s been a privilege to work with Janina and Michael on updating the Note, alongside the hard work of all the RQTF members. As the Note continues to be refined ahead of publication, it remains a great experience to be involved in the process.
The Perth Web Accessibility Camp was held on 12 February at VisAbility, and a fantastic day was had by all. With over 100 people in attendance and a great diversity of presentations, it was a wonderful opportunity to talk about digital access from a variety of perspectives. Here’s a selection of my personal highlights from the day.
The keynote was delivered by Professor Denise Wood of Central Queensland University. The topic, ‘Designing Culturally Responsive and Inclusive Online Learning Environments: An Evidence-Based Approach’, discussed how people with disability engage with learning tools and some of the challenges they may face. The key takeaway for me was that accessibility issues are much less about the online learning platform used by the institution and much more about how the content on top of it is designed. It can also be useful to include additional accessibility tools to support students broadly.
Next up was ‘Here comes WCAG 2.1!’ by Amanda Mace from Web Key IT and Julie Grundy from Intopia. There was some great discussion of the new WCAG 2.1 Success Criteria, explaining the importance of features such as reflow and ensuring that content on mobile devices works effectively for people who may not be able to move their device to activate various sensors. With WCAG 2.1 gradually being adopted internationally, it was a great introduction to how the new extensions build on the legacy WCAG 2.0 requirements.
After the break it was my turn, providing an update on the W3C advice on inaccessible CAPTCHAs. In the presentation I talked about how traditional CAPTCHAs, such as text on bitmapped images and audio-based CAPTCHAs, are not only inaccessible but also insecure. I also provided an update on the advice our group has been putting together as part of the CAPTCHA advisory Note. It was great to have a chance to share the information.
A topic that is starting to get more attention was highlighted by Vithya Vijayakumare and David Vosnacos from VisAbility, who discussed the accessibility implications of 360-degree video. In particular, they explored caption positioning for people who are Deaf or hearing impaired and how binaural recording can be used to provide an effective surround-sound experience. This is rapidly becoming a hot topic in international standards discussions, so the presentation was both timely and informative.
Another important emerging topic came from Claudia De los Rios Pérez from Curtin University, who discussed the implications of web design for neurodiverse users. The needs of people with autism and similar conditions can be overlooked in the pursuit of WCAG compliance, so it was good to get some guidance on how to structure websites in a way that better supports the diversity of users.
While all these presentations were fantastic, my favourite from the Camp was from Clare Chamberlain on the topic ‘Negative Life Trajectory – a battle for Plain English’. The topic really challenged the audience to consider the implications of language and the need to carefully consider our messaging. The takeaway for me is that we tend to bury our websites in complicated language and clutter, which affects a number of different disability groups, yet in most cases the same message can be delivered effectively through some simple restructuring and rephrasing. Prior to this presentation I’d always thought it was quite challenging to simplify language without losing meaning, but the presentation demonstrated it can be done effectively with a bit of time and consideration.
In addition to the presentations, it wouldn’t be a Perth Web Accessibility Camp without the infamous Great Debate, now in its sixth year, with the fiery topic ‘Paying extra for accessibility is totally worth it’. The debate did its job well in waking up the audience after lunch and providing some great food for thought along the way.
Many thanks to my colleagues in the Camp organising committee for what was a fantastic day and VisAbility for hosting the event.