In 1998, Section 508 of the Rehabilitation Act was amended, and the first version of WCAG followed in 1999. In the period from 1998 to 2000, a number of new companies were created around web accessibility. Major players in the US like Deque, The Paciello Group, and Bartimaeus Group (later acquired by/merged with SSB Technologies, becoming SSB BART Group), TecAccess, and more were all founded in that time period. Some of the founders of these companies had been in accessibility for years already, while some of them were completely new to it and saw the new Section 508 requirements as a potential business opportunity. After all, the new 508 requirements meant that government agencies needed to make their stuff accessible and, as a consequence, the agencies would (should) be passing those requirements on to their vendors. In some ways this was exactly the case, and vendors did rise to the occasion.

Similarly, we're facing new opportunities in accessibility now. Since 2015, the rate of lawsuits and demand letters around web accessibility has clearly gotten the attention of more entrepreneurial types who see the uptick in litigation as an opportunity to make money.
Using this opportunity to make money isn't a bad thing. The potential in this market is quite high, and the accessibility industry is sorely in need of a larger talent pool for the available work. The problem, unfortunately, is that I've recently begun noticing new players in the consulting space offering poor-quality work at low prices. Another problem: companies marketing products that promise magical “fixes” for customer websites that simply do not work. One company, Agilitech, says its product can give you “Section 508 Accessibility Compliance With a Single Line Of Code”. This is demonstrably false.
I’m not the only person to feel this way.
Survey time! Please answer ‘Yes’ or ‘No’: All of your website accessibility issues can be automatically fixed by an overlay product.
— Karl Groves (@karlgroves) May 21, 2018
Only 1 respondent to the above survey answered “Yes”. Gee, I wonder who they worked for?
Why you cannot magically automate your way into compliance
It should be universally understood that automated testing tools cannot offer complete test coverage for all possible accessibility issues on the web. Therefore it stands to reason that if you cannot automatically find all your site’s accessibility issues, you certainly cannot automatically fix all of them, either. This is extraordinarily simple logic. In fact, automatically fixing issues is even less likely to be successful than finding them. This fact is demonstrated within Mallet. While Mallet is extremely good at finding & fixing some issues, it is still limited to around two dozen types of ‘fixes’ that work on their own without any configuration. The remainder of Mallet’s fixes require some level of configuration. To put this into perspective, Tenon.io has approximately 200 accessibility tests that look for around 2500 different failure conditions. In other words, we can easily find at least 10x as many issues as we can reliably fix, because fixing the issues requires far more knowledge about the user interface than an automated tool has.
Top 85% of automatically discoverable issues, by volume
To see why claims of automatically fixing issues don’t hold up, let’s take a look at the top automatically discoverable accessibility issues, by volume, with an eye toward the potential for actually fixing them:
- Insufficient contrast. Quite simply, the test for sufficient color contrast compares the computed colors of an element’s foreground and background to verify that the contrast is sufficient according to the WCAG thresholds (a sketch of the underlying math appears after this list). By volume, insufficient color contrast is the most common accessibility issue on the Web. Accurately testing for color contrast is far harder than even most accessibility people understand. Testing basic foreground and background colors is the easy part; there are many other ways to get it wrong. Does the element use absolute positioning? Does the element have a background-image? Does the background include a CSS3 gradient? Does the element use an animated background? All of these and more can make it impossible to accurately test the contrast and, as I’ve already said, if you can’t accurately test for something you certainly can’t accurately fix it.
- `<table>` missing header cells or missing header-cell-to-data-cell relationships. When it comes to data tables, this is the most common issue. There is no reliable way to accurately fix this problem. Even if you have a perfectly symmetrical table with a consistent number of rows and columns throughout, you can’t know whether the developer meant the top row to be headers or the first column. Or maybe both are meant to be headers? Add in some colspans and rowspans and you can forget about automatically fixing this at all.
- Link has a `title` attribute that is the same as the text inside the link. This is quite common, especially in websites that use Drupal and WordPress. This is really the only issue you can automatically fix with 100% certainty: all you need to do is remove the `title` attribute (a sketch of this fix appears after this list). Overall, the issue of redundant title attributes is only a minor annoyance to users of screen readers, and they can adjust their verbosity settings so the redundant titles aren’t announced.
- Link quality (i.e. links have the same text but different destinations; links have identical `href` attribute values but different text; adjacent links going to the same destination; link has no text inside it; etc.). Link quality is generally abysmal on the web. Links should clearly and concisely communicate to users the content to be found or the actions the user will perform at the destination. While we can often effectively test for things that are symptoms of bad links, we cannot automatically replace a link’s content with something better without knowing far more about both the current page and the destination.
- Image is missing a text alternative / image has a low-quality text alternative. The lack of useful text alternatives for non-text content is a huge problem on the web. While Tenon.io currently has about 50 individual tests around WCAG 1.1.1, we still cannot reliably tell whether a site fully complies with this Success Criterion. Automation – up to and including the current state of the art in Artificial Intelligence – is wholly unable to supply an effective text alternative for non-text content. Is the image decorative, or is it meaningful? Does the image function as an icon? Even if you can use AI to know what the image contains, there’s no way of knowing what it is meant to represent to users and therefore no way of automatically fixing this issue reliably.
- Control uses CSS-generated content without an effective label. Similar to the above, this is often caused by developers who use icon fonts on controls. What happens in these cases is that although the glyph is presented visually, the control does not have a usable label exposed to assistive technologies. Automatically testing for this is super easy (a sketch of such a check appears after this list), but automatically adding an accurate label to these controls is impossible without knowing what the glyph is meant to represent to the user.
- Poorly structured dynamic elements (`<a>` element with `onclick`, no `href`, no `tabindex`, no `role`; event handlers bound to a non-actionable element that lacks `role` and/or `tabindex`; link uses an invalid hypertext reference; element has orphaned ARIA attributes; element uses multiple strategies to create labels). Poorly structured dynamic elements are quite pervasive and a symptom of developers who do not understand the importance of accurately conveying the Name, State, Role, and Value of dynamic controls. They may also have an incomplete or inaccurate understanding of WAI-ARIA. These issues are easy to find and impossible to fix completely. There’s truly no way to automatically know what the control should be conveying to assistive technologies.
- Form element has no label. By volume, this is the most common issue relating to forms. For assistive technology users, this issue is a huge problem. Without proper labels on form fields, assistive technology users will be unable to successfully complete the form, or can only do so with serious difficulty. There’s no way to automatically fix this issue reliably. As evidence of this, consider that even assistive technologies will only guess at a label when there’s unstructured text immediately next to the control.
- Nested tables. This test looks for tables inside of tables, skipping cases where one of the tables has `role="presentation"`. Nested tables are often the result of using tables for layout and in many cases can cause considerable confusion for assistive technology users. There are two ways to fix this problem: 1) add `role="presentation"` to the table(s) that are used for presentation, or 2) completely restructure the code so that there are no nested tables. There’s no way to automatically do either of these things with any accuracy because there’s no way to know which table(s) are presentational. Maybe they all are. Maybe only one of them is. Maybe some of them are and some are not. At any rate, approach #2 might destroy the design and approach #1 might get it completely wrong.
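To make the contrast point concrete, here is a minimal sketch of the WCAG 2.0 contrast-ratio math for two plain sRGB colors. It assumes the foreground and background have already been resolved to simple RGB values; as noted above, resolving those values in the presence of gradients, background images, and positioning is exactly the part that is hard to automate. The function names are mine, purely for illustration.

```typescript
// Minimal sketch (not any product's implementation) of the WCAG 2.0 contrast-ratio math.
// Inputs are assumed to be already-resolved sRGB colors as [r, g, b] in 0-255.

type RGB = [number, number, number];

// Convert one 0-255 sRGB channel to its linear value per the WCAG 2.0 definition.
function linearize(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

// Relative luminance of an sRGB color.
function relativeLuminance([r, g, b]: RGB): number {
  return 0.2126 * linearize(r) + 0.7152 * linearize(g) + 0.0722 * linearize(b);
}

// Contrast ratio: (lighter + 0.05) / (darker + 0.05); ranges from 1:1 to 21:1.
function contrastRatio(foreground: RGB, background: RGB): number {
  const l1 = relativeLuminance(foreground);
  const l2 = relativeLuminance(background);
  return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
}

// #767676 text on a white background comes out to roughly 4.54:1, just past the
// 4.5:1 AA threshold for normal-size text.
console.log(contrastRatio([118, 118, 118], [255, 255, 255]).toFixed(2));
```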
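And here, for contrast, is the one fix from the list that genuinely can be automated with full confidence: stripping a link’s `title` attribute when it merely repeats the link text. This is a rough DOM sketch of my own, not the approach any particular product uses.

```typescript
// Rough sketch of the one reliably automatable fix above: removing a link's
// title attribute when it simply duplicates the link text. Illustrative only.
function removeRedundantLinkTitles(root: Document | Element = document): void {
  root.querySelectorAll<HTMLAnchorElement>('a[title]').forEach((link) => {
    const title = (link.getAttribute('title') ?? '').trim().toLowerCase();
    const text = (link.textContent ?? '').trim().toLowerCase();
    if (title !== '' && title === text) {
      // The title adds nothing for anyone, so dropping it cannot make things worse.
      link.removeAttribute('title');
    }
  });
}
```

Notice how narrow the safe case is: the attribute is only removed when it adds literally nothing. Anything broader would require knowing what the author intended.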
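Finally, to illustrate the “easy to find, impossible to fix” point for controls that rely on CSS-generated icons, here is a simplified detection sketch. It checks only a few obvious naming mechanisms rather than running the full accessible-name computation, and the heuristic is my own, so treat it as an illustration rather than a real test.

```typescript
// Simplified sketch: flag links and buttons whose only visible content comes from
// CSS-generated content (e.g. an icon-font glyph) and that expose no accessible name.
// A real checker would run the full accessible-name computation; this is a heuristic.
function flagUnlabeledIconControls(doc: Document = document): Element[] {
  const flagged: Element[] = [];
  doc.querySelectorAll('a, button').forEach((control) => {
    const hasText = (control.textContent ?? '').trim().length > 0;
    const hasAriaName =
      control.hasAttribute('aria-label') || control.hasAttribute('aria-labelledby');
    const generated = window.getComputedStyle(control, '::before').content;
    const usesGeneratedContent =
      generated !== 'none' && generated !== 'normal' && generated !== '""';
    if (usesGeneratedContent && !hasText && !hasAriaName) {
      // Detecting the problem is trivial; knowing what label the glyph deserves is not.
      flagged.push(control);
    }
  });
  return flagged;
}
```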
As you can see from the list above, among the top 85% of accessibility issues (by volume), only one type of issue can be reliably fixed via automation – redundant title attributes – and that’s an annoyance-level issue that can be easily avoided by settings in the user’s assistive technology. By the way, that top 85% of accessibility issues amounts to approximately 10% of Tenon.io’s available accessibility tests, at most.
Some things are wholly impossible to test for and therefore impossible to automatically fix
As always, it is important to remind the reader that there are limits to what can be discovered with automated testing. It stands to reason that if you cannot automatically test for something, you definitely cannot automatically fix it, either.
- Captions/transcripts for audio-only content or audio content in videos
- Audio description for video content
- Content that relies on sensory characteristics to understand
- Audio control
- Keyboard trap
- Pause, Stop, Hide
- Error prevention and error handling
- Effective focus management
- Reading level
The above is only a subset of the many things in web accessibility that cannot be accurately tested via automated means. Many of them are hugely impactful for people with disabilities.
Companies that claim to automatically make you “compliant” are selling lies
Conformance to a standard means that you meet or satisfy the ‘requirements’ of the standard. In WCAG 2.0 the ‘requirements’ are the Success Criteria. To conform to WCAG 2.0, you need to satisfy the Success Criteria, that is, there is no content which violates the Success Criteria.
To review what I’ve discussed so far:
- An automated web accessibility testing tool can definitively test for approximately 25-29% of best practices for WCAG 2.0
- Of the top 85% of automatically discoverable web accessibility issues, only one of those issues can reliably and accurately be fixed via automated means
- There are approximately a dozen additional fixes that can be automatically applied. By volume, most of these issues fall within the remaining 15% minority, and some of them also require customization.
- An automated web accessibility testing tool cannot test for approximately 40% of best practices for WCAG 2.0 in any meaningful way, and therefore those issues cannot be fixed automatically, either.
- Many of those untestable items are huge roadblocks for users with disabilities.
Buying into the false promises of magical automatic compliance will only prolong your risk exposure. These products cannot bring you into compliance, cannot make your site more accessible to users with disabilities, and therefore cannot reduce your risk of lawsuit or administrative complaint.
An open challenge to all vendors of such products
Just as with weight loss or financial stability, there is no quick, magical fix for accessibility, and I challenge any vendor of such a product to debate me on this topic at CSUN 2019. Together, we can co-author a talk submission where each of us gets equal time to present our case.