The following data comes from automated accessibility testing of the Alexa Top 100 US websites (minus the pr0n, search engines, social networks, and sites driven primarily by user-generated content) using AQUA, and lists their performance from worst to best, based on the density of errors per page. This information comes with the important caveat that it is limited to automated testing. That said, my own experience has been that poor performance in automated accessibility testing correlates strongly with poor performance in manual accessibility testing as well.
| Site Name | Errors | Warnings | Pages Tested | Avg. Per Page (Errors + Warnings) | Avg. Per Page (Errors Only) |
|-----------|-------:|---------:|-------------:|----------------------------------:|----------------------------:|
|           | 75     | 46       | 1            | 121.00                            | 75.00                       |
|           | 23691  | 9404     | 500          | 66.19                             | 47.38                       |
|           | 22746  | 35552    | 496          | 117.54                            | 45.86                       |
|           | 22255  | 18354    | 498          | 81.54                             | 44.69                       |
|           | 21663  | 2477     | 500          | 48.28                             | 43.33                       |
|           | 183772 | 131066   | 4979         | 63.23                             | 36.91                       |
|           | 17652  | 2697     | 499          | 40.78                             | 35.37                       |
|           | 383    | 1886     | 12           | 189.08                            | 31.92                       |
|           | 11456  | 5478     | 361          | 46.91                             | 31.73                       |
|           | 13868  | 13035    | 491          | 54.79                             | 28.24                       |
|           | 12850  | 19892    | 464          | 70.56                             | 27.69                       |
|           | 12104  | 27059    | 498          | 78.64                             | 24.31                       |
|           | 11863  | 34408    | 499          | 92.73                             | 23.77                       |
|           | 3987   | 1464     | 199          | 27.39                             | 20.04                       |
|           | 201    | 243      | 16           | 27.75                             | 12.56                       |
|           | 6085   | 6458     | 496          | 25.29                             | 12.27                       |
|           | 4824   | 15804    | 431          | 47.86                             | 11.19                       |
|           | 4690   | 6705     | 500          | 22.79                             | 9.38                        |
|           | 2908   | 848      | 361          | 10.40                             | 8.06                        |
|           | 3823   | 10364    | 483          | 29.37                             | 7.92                        |
|           | 2992   | 4822     | 498          | 15.69                             | 6.01                        |
|           | 60     | 56       | 10           | 11.60                             | 6.00                        |
|           | 833    | 401      | 142          | 8.69                              | 5.87                        |
|           | 2544   | 1814     | 474          | 9.19                              | 5.37                        |
|           | 1447   | 940      | 275          | 8.68                              | 5.26                        |
|           | 1803   | 745      | 368          | 6.92                              | 4.90                        |
|           | 1158   | 3289     | 338          | 13.16                             | 3.43                        |
|           | 1176   | 3831     | 479          | 10.45                             | 2.46                        |
|           | 2      | 272      | 1            | 274.00                            | 2.00                        |
|           | 489    | 6316     | 314          | 21.67                             | 1.56                        |
Some Definitions
Error
In this context, an “error” is something that was found using a test which is clearly ‘pass/fail’ in nature. Things such as missing alt attributes for images fall into this category.
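To illustrate what a pass/fail check looks like, here is a minimal sketch (not AQUA's implementation) of the missing-alt-attribute test, using Python's standard-library HTML parser:

```python
# Sketch of a 'pass/fail' accessibility check: count <img> elements
# that lack an alt attribute entirely. Each hit is an unambiguous error;
# no human judgment is required.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.errors = 0

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs; names are lowercased.
        if tag == "img" and "alt" not in dict(attrs):
            self.errors += 1

checker = MissingAltChecker()
checker.feed('<p><img src="a.png"><img src="b.png" alt="Logo B"></p>')
# checker.errors is now 1: only the first image is missing its alt.
```

Whether an alt attribute that *is* present actually describes the image is a different question, and that is exactly where warnings come in.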
Warning
In this context, a “warning” is something that was found using a test which does not have a clear ‘pass/fail’ criterion but would instead require human inspection to verify.