
Understanding the limitations of automated accessibility testing

Making sure your digital experience is accessible to everyone is key to inclusivity (and also a legal requirement for public sector bodies).

In this post, we'll quickly explore automated accessibility testing tools and discuss why they shouldn't be the sole approach to testing your website for accessibility issues.

What are automated testing tools?

Automated accessibility testing tools are designed to identify accessibility issues quickly and efficiently.

They scan web pages and report on potential problems, such as missing alternative text for images, colour contrast issues, or keyboard navigation problems.
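To make this concrete, here is a minimal sketch of running axe-core (the engine behind Axe DevTools) programmatically; the exact setup and rule configuration will vary by project:

```ts
// A minimal sketch using axe-core's programmatic API.
// Assumes axe-core is installed and this runs in a browser context.
import axe from "axe-core";

axe.run(document).then((results) => {
  // Each violation carries a rule id, a description, and the DOM
  // nodes it affects.
  for (const violation of results.violations) {
    console.log(violation.id, violation.description);
  }
});
```

Note that a scan like this can only report what the rules engine can detect from the DOM, which matters for what follows.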

What's wrong with automated accessibility testing?

There is nothing inherently wrong with automated accessibility testing. In fact, I use Axe DevTools daily to help ensure my code meets the WCAG 2.1 guidelines.

However, problems arise when automated testing alone is used as the measure of website accessibility. Why? Research conducted by GOV.UK suggests that these tools may miss up to 71% of issues.

So what should I do?

Embracing a multifaceted approach to accessibility testing should be your goal. This includes:

  • Automated testing tools
  • Manual testing, with or without a guiding tool like Microsoft Accessibility Insights
  • Testing with assistive technologies (e.g. VoiceOver, JAWS)
  • A third-party accessibility audit
  • Testing with real people

A multifaceted approach like this ensures that all bases are covered in terms of website accessibility testing, with each method complementing the others.

Testing automated testing tools

Initially, I planned to put together a comprehensive set of tests to demonstrate the pitfalls of automated testing. However, within the first half-hour, I discovered how I could pass automated testing tools with an extremely inaccessible webpage.

Extreme tests

Here is an example of the gov.uk homepage, showing zero accessibility errors with Axe.

A screenshot of the gov.uk website, with accessibility tool axe showing 0 accessibility errors.

Let's see if the automated tool notices a blur effect applied to the page.
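Applying the blur takes a single style change from the browser console; the blur radius here is illustrative:

```ts
// Illustrative: blur the entire page. The content becomes unreadable,
// but the underlying DOM that the scanner inspects is unchanged.
document.body.style.filter = "blur(4px)";
```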

Gov.uk homepage with a blur effect applied to it, making it difficult to read.

Nope, it still looks like the page has no accessibility issues.

Let's go more extreme and flip the website upside down.
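As before, one line from the console is enough (again illustrative):

```ts
// Illustrative: rotate the whole page 180 degrees. The markup still
// passes every rule the automated tool knows how to check.
document.body.style.transform = "rotate(180deg)";
```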

Screenshot of the gov.uk website flipped upside down, with the automated testing tool reporting 0 errors.

Nope, nothing is picked up by our automated accessibility tool.

This small, extreme example highlights why automated accessibility testing should not be carried out in isolation.

Findings

This post does not aim to discredit the use of automated accessibility tools. In fact, I believe they are an essential part of any modern development pipeline. (Axe is particularly good here: after the automated checks run, it guides you into manual testing.)

However, to ensure that your website is accessible to all users, it's crucial to leverage the benefits of automated testing in conjunction with other testing methods, such as manual testing, third-party auditing, and testing with actual people.