The Web Content Accessibility Guidelines 1.0 were published in 1999 and quickly grew out of date. The proposed new WCAG 2.0 is the result of five long years’ work by a Web Accessibility Initiative (WAI) committee that never quite got its act together. In an effort to be all things to all web content, the fundamentals of WCAG 2 are nearly impossible for a working standards-compliant developer to understand. WCAG 2 backtracks on basics of responsible web development that are well accepted by standardistas. WCAG 2 is not enough of an improvement and was not worth the wait.
Prepare for disappointment
If you’re a standards-compliant web developer, you already know about web accessibility and are familiar with the only international standard on that topic, the Web Content Accessibility Guidelines. WCAG 1 just celebrated its seventh birthday and is closing in on the end of its life. WCAG 1 badly needs revision.
On 27 April 2006, WAI published the first instalment of the interminable sequence of documents required for the revision, WCAG 2.0, to become a standard.
If you were hoping for a wholesale improvement, you’re going to be disappointed. A lot of loose ends have been tidied up, and many low-priority guidelines are now pretty solid. The problem here is that standardistas already knew what to do to cover the same territory as those low-priority guidelines. Where WCAG 2 breaks down is in the big stuff. Curiously, though, and perhaps due to meticulous editing over the years, the big stuff is well camouflaged and, to an uninformed reader, WCAG 2 seems reasonable. It isn’t, and you as a working standards-compliant developer are going to find it next to impossible to implement WCAG 2.
Where to find the documents
In the great tradition of the W3C, the actual WCAG 2 documents are confusing and hard to locate. (I’ll also give you page counts, as printed to U.S. letter–sized PDF from Safari with unchanged defaults, along with word counts exclusive of markup.) I printed and read all three of these documents for this article.
- Web Content Accessibility Guidelines 2.0 is the actual root document and is the only one that is “normative,” i.e., a standard. It’s described, in W3C parlance, as a Last Call Working Draft. (72 pages, 20,800 words)
- Understanding WCAG 2.0 is a document that purports to explain WCAG 2. (165 pages, 51,000 words)
- Techniques for WCAG 2.0 provides “general” techniques. (221 pages, 88,000 words)
When compared against typical page dimensions in books, the three WCAG 2 documents, at 450 pages, exceed the size of each of the books published on the topic of WCAG 1, including mine. Additionally, according to many blog reports (Snook, Clagnut, Sitepoint), Shawn Lawton Henry of the WAI Education & Outreach Working Group cautioned attendees at her South by Southwest 2006 presentation to read only the Understanding document, not the actual spec. Since the Understanding document is more than double the size of what it purports to explain, this itself may indicate a problem with WCAG 2.
There’s a separate document, not updated since November 2005, covering HTML techniques. It isn’t included in this article. Also, “guidelines” in WCAG 1 are now called “success criteria” in WCAG 2, a change in nomenclature I will ignore.
In the discussion below, links to and within these documents were difficult to finesse, given their numerous, but still insufficient, fragment identifiers. In some cases—paging Steve Faulkner!—no sensible title attribute was apparent.
You don’t have a lot of time to comment
After working on WCAG 2 for five years, WAI gave the entire industry and all interested parties, including people with disabilities, a whopping 34 days to comment on WCAG 2 (until 31 May 2006). While that is in excess of the suggested three-week minimum, it isn’t long enough. The Working Group, moreover, would like you to fill out a form, possibly using Excel, for each and every issue you disagree with.
I advise you to simply send mail to email@example.com and read the archives of that mailing list (where it’s impossible to tell exactly who submitted what comment via the WAI form). There’s a lengthy omnibus list of comments received via the WAI form. I also advise people to petition for at least another month’s commenting time, quoting W3C process back to them (viz., comment periods “may last longer if the technical report is complex or has significant external dependencies”).
The process stinks
And now a word about process, which you have to appreciate in order to understand the result. The Web Content Accessibility Guidelines Working Group is the worst committee, group, company, or organization I’ve ever worked with. Several of my friends and I were variously ignored; threatened with ejection from the group or actually ejected; and actively harassed. The process is stacked in favour of multinationals with expense accounts who can afford to talk on the phone for two hours a week and jet to world capitals for meetings.
The WCAG development process is inaccessible to anyone who doesn’t speak English. More importantly, it’s inaccessible to some people with disabilities, notably anyone with a reading disability (who must wade through ill-written standards documents and e-mails—there’s already been a complaint) and anyone who’s deaf (who must listen to conference calls). Almost nobody with a learning disability or hearing impairment contributes to the process—because, in practical terms, they can’t.
What WAI is supposed to be doing is improving the web for people with disabilities. Something’s wrong if many participants work in a climate of fear, as they tell me they do. I never hear of similar complaints from WAI’s other groups. The WCAG Working Group is a rogue element within the W3C, one that W3C Director Tim Berners-Lee must urgently bring to heel.
The process is broken, so let’s not be surprised that the result of that process is broken, too.
Less of a travesty, but still a failure
If you ever set aside two hours of your life to read a previous “draft” of WCAG 2, you were probably baffled and/or infuriated. The Working Group has been effective at improving minor guidelines and has excelled at making the whole document seem eminently reasonable. They’ve succeeded spectacularly at burying the lede—hiding the nub of the guidelines deep within the document. They’ve done a beautiful job at making WCAG 2 look like it will actually work. It won’t.
Based on the three documents I read, taking into account both required and suggested practices, let me explain what WCAG 2 really says:
- Exactly what a “page” is, let alone a “site,” will be a matter of dispute.
- A future website that complies with WCAG 2 won’t need valid HTML—at all, ever. (More on that later.) You will, however, have to check the DOM outputs of your site in multiple browsers and prove they’re identical.
- You can still use tables for layout. (And not just a table—tables for layout, plural.)
- Your page, or any part of it, may blink for up to three seconds. Parts of it may not, however, “flash.”
- You’ll be able to define entire technologies as a “baseline,” meaning anyone without that technology has little, if any, recourse to complain that your site is inaccessible to them.
- You’ll be able to define entire directories of your site as off-limits to accessibility (including, in WCAG 2’s own example, all your freestanding videos).
- If you wish to claim WCAG 2 compliance, you must publish a checklist of declarations more reminiscent of a forced confession than any of the accessibility policies typically found today.
- Not that anybody ever made them accessible, but if you post videos online, you no longer have to provide audio descriptions for the blind at the lowest “conformance” level. And only prerecorded videos require captions at that level.
- Your podcasts may have to be remixed so that dialogue is 20 decibels louder than lengthy background noise. (You don’t have to caption or transcribe them, since they aren’t “multimedia” anymore. However, slideshows are now officially deemed to be “video,” which will come as a surprise to Flickr users.)
- You can put a few hundred navigation links on a single page and do nothing more, but if you have two pages together that have three navigation links each, you must provide a way to skip navigation.
- You can’t use offscreen positioning to add labels (e.g., to forms) that only some people, like users of assistive technology, can perceive. Everybody has to see them.
- CSS layouts, particularly those with absolutely-positioned elements that are removed from the document flow, may simply be prohibited at the highest level. In fact, source order must match presentation order even at the lowest level.
- Also at the highest level, you have to provide a way to find all of the following:
- Definitions of idioms and “jargon”
- Expansion of acronyms
- Pronunciations of some words
- You also have to provide an alternate document if a reader with a “lower secondary education level” couldn’t understand your main document. (In fact, WCAG 2 repeatedly proposes maintaining separate accessible and inaccessible pages. In some cases, you don’t necessarily have to improve your inaccessible pages as long as you produce another page.)
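One item in the list above deserves illustration. The offscreen-positioning technique that WCAG 2 would apparently disallow is the standard way to give assistive-technology users a label without cluttering the visual design. A minimal sketch follows; the class name and form fields are my own invention:

```html
<!-- Common offscreen-label technique: the label is shifted far off the
     viewport, so screen readers announce it but sighted users never see
     it. Under WCAG 2 as drafted, it would have to be visible to all. -->
<style>
  .offscreen {
    position: absolute;
    left: -9999px;
  }
</style>
<form action="/search" method="get">
  <label for="q" class="offscreen">Search this site</label>
  <input type="text" id="q" name="q">
</form>
```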
Since these three documents are “drafts,” of course all the above can change. But really, it won’t. A Last Call Working Draft is viewed as substantially complete. It is “a signal that… the Working Group believes that it has satisfied its relevant technical requirements… [and] has satisfied significant dependencies with other groups.” The WCAG Working Group is not going to budge on major issues at this point.
It’s the definitions that sink it
While WCAG 2 calls for all manner of unrealistic and unproven features, those are not what’s going to sink the guidelines. Something as mundane as definitions will take care of that.
WCAG 1 was strongly HTML-specific. Everybody recognized that as a problem in an age when formats that blind people love to hate, like PDF and Flash, are slowly becoming accessible. So WCAG 2 had to be technology-neutral.
Pop quiz: What do the following terms, given with their official WCAG 2 definitions, really mean?
Can you translate any of these terms into words that every reader of this article understands, like “page,” “site,” “valid,” “well-formed,” or “template”? Well, I can’t. Amid all these definitions, where are the templates we use to create sites composed of valid, well-formed pages?
If you’re a standardista working on accessible websites today, are you actually, without even knowing it, an author authoring authored units to be used in authored components in programmatically-determined web units that can be parsed unambiguously?
Take a look at WCAG 2 and you’ll come up with your own checklist of malapropisms and incomprehensible passages. In fact, so much of WCAG 2 is so hard to understand, and almost impossible to apply to real-world websites, that WCAG 2 is no better than its predecessor in one respect—both documents flunk their own guidelines for clear and simple writing.
If you can’t understand the basics of a guideline, and if WCAG 2 in general is so aloof from the real web that it can’t even bother to use words that working developers understand, are you realistically going to be able to implement WCAG 2 on your site? Remember, you cannot officially fall back on the Techniques and Understanding documents for added information. Only the WCAG 2 document itself is “normative.” You sink or swim based solely on that.
And if you have trouble understanding WCAG, does this not imply that someone could come along with a different interpretation and accuse you of violating WCAG, and, by implication, producing an inaccessible site? Since that’s illegal in some parts of the world, a certain degree of clarity is essential, but clarity is something you do not get in WCAG 2.
If you slog through WCAG 2, you’ll notice that even something as deceptively simple as that WCAG 1 guideline on clear and simple writing isn’t there. Nor is there anything actually stronger than that guideline. In fact, there’s nothing at all along those lines to be found in WCAG 2’s Principle 3, “Content and controls must be understandable.”
You do, however, have to take fanatical care to mark up foreign-language passages, idioms, and the like, and if your content “requires reading ability more advanced than the lower secondary education level,” you have to provide “supplementary content” that doesn’t require that reading level. If you’re a learning-disabled person, that’s pretty much all WCAG 2 is willing to do for you.
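For the record, the markup housekeeping this entails looks something like the following. This is an invented sketch; the phrases and titles are mine, not WCAG’s:

```html
<!-- Sketch of the fanatical markup care WCAG 2 demands:
     foreign-language passages flagged with lang, abbreviations
     expanded. All examples invented for illustration. -->
<p>The committee’s <i lang="fr">raison d’être</i> is accessibility.</p>
<p><abbr title="Web Accessibility Initiative">WAI</abbr> would also have
you make idiom definitions and pronunciations findable.</p>
```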
Based on my analysis and on presentations by Gian Sampson-Wild, it seems that dyslexics and others with cognitive disabilities have been sacrificed on the altar of testing: WCAG 2 requires that its success criteria be testable with “high inter-rater reliability” by “people who understand WCAG 2.0.”
“High inter-rater reliability” is not defined. Does it mean eight out of ten people? Six? All ten?
It seems that everybody assumed it would be easy to find “people who understand WCAG 2.0” yet who also disagree that a certain segment of content is clearly and simply written. I assume it was taken as axiomatic that tests of content would seldom achieve “high inter-rater reliability,” which relies on messy human opinion. The Working Group was and is unreasonably fixated on automated testing, in part due to the presence on the Working Group of authors of automated testing applications and algorithms. The group was able to stomach the reality that, for example, alt texts can be evaluated only by humans, but was unwilling to accept that the same applies to “content” generally.
It is harsh but fair to observe that WCAG 2 sells out people with learning disabilities so that a tool like Bobby, or a competing or successor tool, can test a larger number of criteria with a higher success rate.
The creative fiction of multiple levels
WCAG 1 had three levels of “conformance,” which, in typical WAI style, were given a total of six names—Priority 1/Level A, Priority 2/Level AA (annoyingly written as “Double-A” to get around faulty screen-reader pronunciation), and Priority 3/Level AAA (“Triple-A”). Standardistas eventually figured out that Priorities 1 and 2 were what you really needed to make an accessible website; Priority 3 was strictly optional (also onerous and impossible to meet in principle). Even some governments, like Canada’s, require Priority 2 compliance for their own sites, though it is not necessarily achieved.
When experts carry out evaluations of websites against WCAG 1, most of the time they consider the first two priority levels. Few, if any, sites pass Priority 3 evaluation; the Disability Rights Commission and Nomensa found that no sites tested met Priority 3.
To a rational observer, all this means that Priorities 1 and 2 in WCAG 1 are really a single set of rules and Priority 3 is irrelevant and unattainable. Getting this idea through the heads of the Working Group (or rather, through the head of one of the cochairs) was impossible, so in WCAG 2 we’re still stuck with three levels. But get this: All levels are deemed important.
To translate: We poor saps misunderstood WCAG 1’s priority levels to be real priority levels. WCAG 2 considers all of its guidelines “essential for some people,” though they’re still broken up into three levels. But actually, if you look closely at the WAI documents:
- Even if you comply with all three levels in WCAG 2, you may still end up with an inaccessible site.
- You never have to comply with more than half of the Level 3 guidelines.
- The WCAG 2 document itself baldly states that “It is not recommended that Triple-A conformance ever be required for entire sites.”
- In a circular contradiction, Guideline 4.2.4, at Level 3, doesn’t even require you to meet Level 3 in some cases.
Which level would you like to conform to? Please make your selection now.
In a further absurdity, the Working Group couldn’t even finesse its guidelines to apply to all levels. Some guidelines don’t even manifest themselves at Level 1, the lowest level. I did a count:
- Levels 1 + 2 + 3: 7 guidelines
- No Level 1: 1 guideline
- No Level 2: 2 guidelines
- No Level 3: 1 guideline
- Level 1 only: 2 guidelines
- (Level 2 only or Level 3 only: Nil)
It’s as if web standards never existed
While people like you and me have been labouring in the trenches since approximately 1998 to improve web standards—improve support in browsers, improve understanding among authors, improve the basic task of explaining standards—the WCAG Working Group has been off in its parallel universe cooking up guidelines that apply equally ambiguously to everything. But the Working Group certainly did take the time to exterminate some accepted concepts.
Yes, we know already: A site with valid HTML is not automatically accessible. We’ve got a couple of fun little example pages to look at (by Gez Lemon and Bruce Lawson). But that’s all they are—examples. In the real world of clueless tag-soup developers, the growing minority who understand valid HTML are an elite who also understand accessibility. They understand which accessibility features you get for free with valid HTML (like alt texts, which—yes, we know already—have to be written correctly). These developers take the time to include the remaining accessibility features anyway.
They also understand that tag soup produces unpredictable results in browsers and in screen readers. They know that a single unencoded ampersand, or omitted semicolon, or stray Unicode character on a page may knock it into the land of invalid HTML, but those are trifling examples not found in tag-soup sites like Amazon and eBay. (They know that Amazon and eBay are successful despite their source code.) They know that validity is a fragile thing that indeed can be blown out of the water by something as simple as a character like an é, an (sic), or an & in the wrong place. They know all that.
Nonetheless, valid HTML was a second-level requirement in WCAG 1. You almost never find it in a commercial site—Nomensa’s recent survey, which found four examples out of 99 sites it manually checked, is the highest I’ve ever seen. But, as a requirement, it warned developers that, while tag soup is the norm, it is not what we want.
WCAG 2 upends that apple cart completely. You never have to have valid HTML in WCAG 2–compliant sites. All that’s required is that the page be parsed unambiguously (Guideline 4.1—a Level 1 guideline with no Level 2 or 3 guidelines). This is supposed to mean “no improperly-nested elements,” but you’d never know that from the term itself.
In an HTML page, you can keep right on using all the misplaced stray characters you want, but you can’t nest <p> elements. You do not have to use any elements or attributes required by the specification. You do not have to use elements according to specification. All this spells trouble for the case of forms, an area of constant annoyance for screen-reader users. A document made up of nothing but spans, if unambiguously parsable, passes WCAG 2 free and clear.
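To make the loophole concrete, here is a hypothetical page fragment of my own devising. Nothing in it is misnested, so it parses unambiguously and would appear to clear Guideline 4.1, yet it offers assistive technology no headings, lists, links, or form semantics whatsoever:

```html
<!-- Invented example: unambiguously parsable, semantically empty.
     A screen reader finds no real heading and no real links here. -->
<div>
  <span class="heading">Products</span>
  <span class="link" onclick="location='widgets.html'">Widgets</span>
  <span class="link" onclick="location='gadgets.html'">Gadgets</span>
</div>
```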
XHTML pages, according to spec, are supposed to stop dead in their tracks at the first ill-formed content, but we know they do not do so in the real world, where XHTML is treated like a kind of HTML with added closing slashes (save for the tiny few perfectionists who serve XHTML as XML). So in fact this requirement gives XHTML the same pass it gives HTML.
Does any of that really solve the problem? Or does it have enough of an appearance of solving the problem that it could be voted into existence by Working Group members from companies like IBM, Oracle, and SAP, whose software cannot reliably produce genuine valid HTML? (IBM has been actively promoting a DHTML accessibility technique that breaks the HTML spec. Oddly, and futilely, the Techniques document discourages such a thing.)
Do you think WCAG 2’s guideline is good enough to improve the practices of tag-soup developers? Even if valid HTML everywhere all the time is unattainable, the fact remains that, in 2006, we have never had more developers who understand the concept and are trying to make it real on their own sites. WCAG 2 undoes a requirement that, were it retained, could be perfectly timed now.
Captioning and audio description for multimedia
If there’s any area in which the application of WCAG 1 was a total failure, it’s multimedia. People have been quite happy to ignore the requirements for captions (for the deaf) and audio descriptions (additional narration for the blind), both of which were required at the lowest accessibility level. (Actually, it was worse than that from a deaf person’s perspective. You could get by just with a transcript, not actual captions.)
Captioning and description simply are not found in the wild. When there’s any access at all, it’s through captioning. In this way, online multimedia follows TV, home video, and cinema in major democracies, where captioning is common and description isn’t. (Who can forget the irony of AOL’s head of accessibility, a blind man, announcing captioning on “select” AOL videos, but no audio description at all?)
For a deaf or a blind person who wants to understand multimedia, WCAG 2 offers no real improvement. The transcript-only loophole has been closed, and captions remain a requirement at the lowest level for prerecorded video. But instead of audio description, you can get by with a figment of the Working Group’s imagination called a “full multimedia text alternative including any interaction”. A discredited holdover from WCAG 1, it’s apparently a combination of transcript of dialogue and sound effects (which blind people don’t need), transcript of audio descriptions (which deaf people don’t need), and links to any interactive components in the video.
The whole thing is supposed to be of help to deaf-blind people, who were never surveyed for their preferences, an action I recommended to WAI at a face-to-face meeting in 2003. Nor was any user testing carried out. (That’s all I can tell from published evidence, anyway. I sent e-mail inquiries to deaf-blind organizations in several countries asking if they’d been surveyed or had any opinions, with no response.)
There are about three known examples of such a transcript in the seven-year history of WCAG (e.g., DigNubia). And there really aren’t any HTML semantics for such transcripts, unless you wanted to push the envelope of the definition list (a banned usage in “HTML 5”).
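If you did attempt such a combined transcript in HTML, the definition list is about the only structure available, with speakers and sound cues as “terms” and their content as “definitions.” This is an invented sketch, and precisely the envelope-pushing usage just mentioned:

```html
<!-- Invented sketch: a combined transcript marked up as a definition
     list. Speakers and cues in dt; dialogue and descriptions in dd.
     Not a sanctioned use of dl. -->
<dl class="transcript">
  <dt>[Description]</dt>
  <dd>A woman enters a darkened laboratory.</dd>
  <dt>Scientist</dt>
  <dd>Let’s run the sequence again.</dd>
  <dt>[Sound]</dt>
  <dd>Machinery powering up.</dd>
</dl>
```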
At the next-to-lowest compliance level, suddenly real audio descriptions are required and, again suddenly, live video must be captioned. Go one step higher and you have to translate your video into sign language (which one?) and provide that same imaginary transcript, among other things. You never have to describe live video.
And while I’ve never been a proponent of requiring the hundreds of live online radio stations to caption themselves, certainly prerecorded podcasts are an obvious source of inaccessible multimedia. But actually, multimedia is defined as “audio or video synchronized with another type of media and/or with time-based interactive components.” Your MP3 podcast isn’t synchronized with anything, so it’s exempt. This requirement will satisfy the majority of podcasters who ever even bothered to think about accessibility, pretty much all of whom decided it was too much trouble even if they liked the idea or worked for WAI at the time. The requirement will also ensure that the status quo of inaccessible podcasting remains untouched.
That’s enough for one article, I think. But that isn’t the end of my comments on WCAG 2; you can check my website for ongoing additions. This article’s comments section, and the tag WCAG2, are other ways to comment.
Announcing the WCAG Samurai
WCAG 2 is not too broken to fix, but we have no reason to think the WCAG Working Group will actually fix it. The Working Group is too compromised by corporate interests, too wedded to the conclusions we see in the current “draft,” too broken in general. What you see in WCAG 2 now is pretty much what you’re gonna get—permanently.
As such, WCAG 2 will be unusable by real-world developers, especially standards-compliant developers. It is too vague and counterfactual to be a reliable basis for government regulation. It leaves too many loopholes for developers on the hunt for them. WCAG 2 is a failure, and not even a noble one at that.
If this is what we get when WAI tries to rewrite WCAG from scratch, maybe there’s another option. WCAG 2 does not “replace” WCAG 1 any more than XHTML “replaced” HTML. Maybe all we really need to do is to fix the errata in WCAG 1. It’s been discussed before, but a WCAG 1.0 Second Edition or a WCAG 1.1 never happened.
Now, though, I can announce that such errata really are going to be published, and my friends and I are going to do the publishing. After the manner of Zeldman’s CSS Samurai posse, which put CSS layouts on the map for browser makers and developers, the WCAG Samurai will publish errata for, and extensions to, existing accessibility specifications.
Of course we aren’t going to infringe anybody’s copyright, but another thing we’re not going to do is run a totally open process. It’s a viable model for standards development, one I have championed in another context, but in web accessibility it is proven not to work. Membership in WCAG Samurai, as in CSS Samurai, will be by invitation only. If we want you, you’ll hear from us.
Of course this is unfair to say the least, if not actively elitist and hypocritical. Call it as you see it. But this is what we’re going to try in the hopes of getting the job done, which WAI and its model have simply failed to do.
Reader Comments
Great article, if a little worrying for developers.
Over the last couple of years, it seems to me the web development industry has really grown up and found its feet in terms of working to a common set of goals. We finally had a much-needed set of “standards” to keep us in check and a yardstick to separate the good pages from the bad. The WCAG 1.0 guidelines, as imperfect as they were, seemed to support and coexist with developing standards-compliant code (XHTML, CSS).
It now seems as though this could be completely upturned by the release of WCAG 2.0. The last thing we need as an industry is to revert back into turmoil and kludge, but it seems as though this is what is being promoted.
I will be keeping a close eye on how WCAG 2.0 develops and will be sure to check out your pages on the subject.
Thanks for the wake up call!
Thank you, Joe, for your succinct view of a complex problem.
It seems crazy that we should be seeing guidelines released by the WAI as a problem, but I guess things should be called as they are seen.
WCAG 2.0 seems like a massive leap backwards and sounds like it can do nothing but hurt the community of web designers who are determined to ensure that content is accessible to the largest possible audience.
This news has put a big dent in my trust in the WAI to champion the needs of users who rely on accessibility. As such, I believe you are right to begin to strive for a better way, based on sound judgement and the needs of real users instead of standards that pander to the needs of corporate members.
I wish you and your Samurai band the very best of luck – “Banzai”.
I last read the working draft documents a couple of years back, but they certainly weren’t as weighty as you described in your article, so I can see that a lot has changed since then! When I read them, they appeared to be a re-ordering of the existing WCAG 1.0 guidelines, but now they have indeed changed most of it beyond recognition.
Just for a bit of flavour, I’ve attempted to read the new drafts, and the first thing that came to mind was “erm…” and a slight state of confusion. I can’t see how on Earth anyone is meant to implement these guidelines any more.
Hopefully, and with a lot of luck, the working group might pull their fingers out and produce a document that is both achievable by the web author *and* provides guidelines that will actually help those it was intended to.
I can’t see that myself, but we can only hope…
Joe Clark has written another diatribe against WCAG 2.0, or is it against the WCAG Working Group? Here are some comments on half-truths and unsupported statements in his article.
Joe Clark writes: “The problem here is that standardistas already knew what to do to cover the same territory as those low-priority guidelines.”
He ignores that WCAG is not just for today’s standardistas, so those ‘low-priority guidelines’ are not superfluous.
He also writes: “When compared against typical page dimensions in books, the three WCAG 2 documents, at 450 pages, exceed the size of each of the books published on the topic of WCAG 1.”
He ignores that, for each success criterion, “Understanding WCAG 2.0” repeats the text of the success criterion and each of the glossary entries that criterion uses, which inflates the page count Joe Clark cites.
Mr. Clark writes: “There’s a separate document, not updated since November 2005, covering HTML techniques. It isn’t included in this article.”
The Working Group reviewed the 13 techniques in http://www.w3.org/TR/WCAG20-HTML-TECHS/ when writing the new techniques document; Mr. Clark ignores the fact that the new techniques document contains 49 HTML techniques (and many general techniques that are also applicable to HTML).
Joe Clark writes: “The Working Group, moreover, would like you to fill out a form, possibly using Excel, for each and every issue you disagree with. I advise you to simply send mail to firstname.lastname@example.org (…)”
The form is meant to enable faster processing of the comments, but Joe Clark doesn’t care about that.
He also writes: “The Web Content Accessibility Guidelines Working Group is the worst committee, group, company, or organization I’ve ever worked with. Several of my friends and I were variously ignored; threatened with ejection from the group or actually ejected; and actively harassed. The process is stacked in favour of multinationals with expense accounts who can afford to talk on the phone for two hours a week and jet to world capitals for meetings.”
These are accusations that readers of the article cannot check. Is this backstabbing for being removed from the Working Group (“actually ejected”)?
He continues: “Something’s wrong if many participants work in a climate of fear, as they tell me they do.”
Again, statements that no reader can check.
He writes: “You’ll be able to define entire technologies as a ‘baseline,’ meaning anyone without that technology has little, if any, recourse to complain that your site is inaccessible to them.”
Joe Clark used to accuse the WCAG WG of hating anything that goes beyond plain black-and-white HTML pages; now he accuses WCAG of allowing too much. He also ignores that baselines are not necessarily set by site developers, and that an unreasonable baseline is a valid reason for a complaint.
“The Working Group was and is unreasonably fixated on automated testing (…)”
No, it avoids success criteria that rely on judgement calls.
Great article Joe. Big! I’ve only skimmed and clicked a few links, but I like what I’ve read so far and it’s good to know that someone has taken the time to dig deep and comment on this monster.
I’ll read it properly this evening, but one thing I picked up on is point 12 under “Less of a travesty, but still a failure”. You say…
“CSS layouts, particularly those with absolutely-positioned elements that are removed from the document flow, may simply be prohibited at the highest level. In fact, source order must match presentation order even at the lowest level.”
When I read the link you provided with this statement (http://www.w3.org/TR/WCAG20-TECHS/#N100C7) I interpret it differently. To me it says ‘mark up your code semantically’. ‘Don’t use CSS to change the semantic meaning of your content’, which I think is a good thing and markedly different to your interpretation of ‘CSS layouts.. may simply be prohibited…’ and ‘source order must match presentation order’.
The example given in the Techniques for WCAG 2.0 concerns the restyling of an unordered list to resemble a two-column table with a heading at the top of each column. It explains that without the CSS, the intended meaning of the code is lost. It’s talking about semantics, not page flow.
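For readers who haven’t opened the Techniques document, the example in question is roughly of this shape (my own reconstruction, not the W3C’s actual markup): a list styled to look like two visual columns, where switching the stylesheet off destroys the pairing implied by the layout.

```html
<!-- Reconstruction (my markup and class names, not the W3C's):
     a list restyled into two visual "columns". With CSS off,
     the items linearize and the name/price pairing implied by
     the layout is lost. -->
<style>
  .menu li.name  { float: left; clear: left; width: 8em; font-weight: bold; }
  .menu li.price { margin-left: 9em; list-style: none; }
</style>
<ul class="menu">
  <li class="name">Sandwich</li>
  <li class="price">$4.00</li>
  <li class="name">Coffee</li>
  <li class="price">$1.50</li>
</ul>
```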
But I think this only goes to reinforce your comment about the guidelines being open to misinterpretation. We’ve both read the same paragraph and taken two completely different meanings from it.
Thanks for stirring up my interest in a guideline that I’ve been putting off reading for months now.
Just wanted to say I enjoyed your article. Although I don’t agree with all your conclusions, I must say I’m impressed that you were able to summarize many of the expressed concerns regarding WCAG 2.0 in an understandable, human way.
I believe WCAG 2.0 is a good framework. The challenge is that it is outcome-based, not method-determined like WCAG 1.0. In order to have an outcome-based standard, it is necessary to have two things:
1) a set of rigidly defined definitions (as you aptly point out), and
2) metrics for compliance – meaning you actually must measure whether each outcome is met, and to what degree – regardless of the method or technology used.
It’s clear you believe WCAG 2.0 does not have these attributes. The WCAG WG believes it does. Like you, I believe the market will ultimately decide the issue.
-Hon. Mark D. Urban, Chair
North Carolina Governor’s Advocacy Council for Persons with Disabilities.
Clearly you don’t agree with this article, which is fair enough. However, despite picking up on a few points in the article, I notice that you don’t actually refute the central argument: that WCAG 2.0 fails in its aims and objectives, and doesn’t meet the needs of developers or those with disabilities.
Personally I find this to be a particularly frustrating situation, as the likes of the WCAG Working Group are supposed to be _promoting_ and _encouraging_ the development of accessible websites. Instead they appear to be hindering those that wish to “do the right thing”, and supplying those who couldn’t care less with ample reason to do nothing.
I spent a fair proportion of the article wondering whether it would be better, as a community, to turn our backs on WCAG 2.0 and produce a pragmatic, usable alternative ourselves. As such, the closing remarks about the WCAG Samurai were most welcome.
Thank you Joe, and I look forward to seeing the fruits of the Samurai’s labours.
Oh, Christophe. Surely you understand the concept of confidential sources. If you need a public source, Gian Sampson-Wild has given presentations in Australia explaining just how the Working Group functions and how it made her feel. But perhaps it says something that you are an esteemed member in good standing of the Working Group yet are unaware of how fearful many of your colleagues are.
Do please give us an updated page- and wordcount of the three WCAG documents without the repetitions you mention.
The “faster processing of comments” enabled by WAI’s form produces mailing-list messages with identical subject lines and senders. This would be “faster” how, exactly?
I admire Christophe’s valiant defence of the Working Group. I note he reuses the Working Group’s typical tactic of accusing its harshest critic of sour grapes. I have information Christophe doesn’t, and the reality differs from what he believes.
Now, what else did I miss, and does any of that conclusively disprove the main points of the article?
I read this with interest, because I would like to be more accessible with what I build, even though my projects rarely if ever require it. Based on Joe’s review, I can’t imagine slogging through all that stuff.
There’s nothing I hate more than going to the W3C site to look something up. Having watched them since their inception, I can’t believe an organization can move so slow. While their intentions are good, they seem ineffective at best as it relates to a web author who has to be in the trenches every day.
Think about it – where do you get all your real info when you need it? It’s not the W3C – at least not for me.
What do we as developers need? We need a *clear and concise* set of guidelines to follow in order to make things as accessible as possible. And nothing the W3C or WCAG seems to produce is either one of those – not unless you are some kind of uber geek – which I apparently am not.
I suppose we can hope that the usual suspects will take the mountain of documents the WCAG produced here and make some sort of sense of it for us. But that shouldn’t be necessary – shouldn’t that be the job of the WCAG in the first place?
The toughest thing for developers in the trenches is to bring both standards and accessibility to projects where everything is evaluated on how much *business* sense it makes. I have been fighting for standards for years, and it’s still tough. People still don’t get it – and trust me, accessibility isn’t *even on their radar screens yet.*
This doesn’t help. If WE can’t get our you know what together, how can we present it to those we need to? Well, we can’t.
As for Joe vs. the WCAG. When I founded http://www.maccaws.org in 2002, Joe graciously joined up and wanted to help. We butted heads a bit (and half of it was my fault, trust me). I only bring this up to say to you Joe – recognize that you can come on *very* strong and aggressively – and that hurts your cause, regardless of what your professional credentials are.
Even though I was half in the wrong, things were so tense between us that the relationship could not continue. I just have to wonder if something similar happened here. I think that especially on the internet where you communicate in so many forms other than face to face, that courtesy and respect are paramount. It needs to be given to be received.
And I also realize having directed Maccaws.org for a year, that getting things done with people all over the place (who have full time jobs as well) on such huge issues is difficult. So I do cut the W3C some slack – but only some.
If all we have to work with are old guidelines that are understandable but out of date, or new ones that are supposedly up to date but not understandable, how can we push accessibility forward?
In the end, it was the web development community as a whole that really did the heavy lifting in moving standards awareness forward. They did it via pushing it on projects (Like Doug Bowman on the famous Wired redesign) and grass roots evangelism. As a group, we just decided we were going to use standards, and we did – and we do – and it caught on.
It’s going to have to be the same way with accessibility. It’s not going to get done any other way, and that means there is a lot of work in blogs and projects to be done, and it’s going to take a long time, just like it did for standards. That’s the way I see it. Ramble mode off.
You know, I like Joe, really! Oh, we’ve tussled in the past (via emails), but he’s a straight shooter and tells it like it is, even if he takes little care to avoid rubbing it in people’s noses. But that’s Joe – love him or leave him.
But as far as working with the W3C, I must step up and comment, in particular to the point raised by Mr. Strobbe, who wrote,
bq. “These are accusations that readers of the article cannot check”.
I can attest to knowing a regular participant in the WG discussion list who has been shut down and ignored on more than one occasion, and I personally have been dismissed by other working groups within the W3C (for me, it was the XHTML 2 authors, who directly contravened the W3C published dispute mechanism – however, that’s another story for another day, but you can start here – “www.wats.ca/show.php?contentid=47”:http://www.wats.ca/show.php?contentid=47 ). So the behavior and treatment described by Joe is not unknown any time you strongly voice an opinion counter to the internal W3C herd.
As a long time “web accessibility” guy myself, I have to concur with Joe – WCAG 2 is unworkable, fatally flawed, and will never receive the uptake within the developer community required to make it useful. Oh, and it does nothing to improve web accessibility – something Joe has amply illustrated. I may not always agree with Joe’s “style”, but this time he’s bang on the money!
I too lend my support and encouragement to the WCAG Samurai (although, Joe, I hope that there will be some form of public vetting at some point in time).
Thanks for the update Joe. I had high hopes for an improvement over WCAG 1 — but it doesn’t look like we’ll see it from WCAG 2.
It seems inevitable that if WCAG 2 isn’t useful and relevant, there will be other groups publishing their own guidelines. In particular, Section 508 in the US and its equivalents in other parts of the world may have been based on the WCAG, but it’s possible they’ll continue to develop in a positive direction.
If Web accessibility boils down to one person’s style, then let’s all pack up and go home right now.
If it boils down to _other people’s accusations_ of one person’s style, let’s get off the Web altogether.
Memo to Mr. Strobbe:
Web Accessibility is *ALL ABOUT* judgment calls – judging/understanding the semantic logic of your text, deciding what the appropriate alt text for your image is (not as easy as it seems), knowing when (and when not) to use tables and lists, knowing when and how to use appropriate headers, etc., etc., etc. Web Accessibility is about logic, reason, and an understanding of different perspectives – it takes a human brain to do it right, and it is thus by its very nature interpretive and subjective. Web Accessibility is *NOT* about getting a bunch of check marks on a spreadsheet and a little badge that you can paste on your website at the end – automated checking tools to the contrary.
Your one response to this point seems to succinctly illustrate how much WCAG 2 *doesn’t* get it…
bq. You’ll be able to define entire technologies as a “baseline,” meaning anyone without that technology has little, if any, recourse to complain that your site is inaccessible to them.
I don’t fully agree with this, and while I do see the problem with the baseline concept, I don’t believe it is as bad as you make it out to be.
There is a major difference between the statements “This site is accessible” and “This site is accessible to users whose UAs meet the baseline requirements”. While the latter will theoretically apply to any document conforming to WCAG 2.0, the former is a much broader statement and, as it does not define exactly who the site is accessible to, one can only assume it means “This site is accessible [to everyone]”.
If the baseline is set too high, users will have “recourse to complain that your site is inaccessible to them”. The problem is that there are no guidelines on specifying a realistic, accessible baseline, and that means the concerns about it being set too high by organisations are indeed valid.
bq. You can’t use offscreen positioning to add labels (e.g., to forms) that only some people, like users of assistive technology, can perceive. Everybody has to see them.
I seem to be interpreting the specification differently from you, with regards to this issue. WCAG 2.0 states:
bq. The intent of this success criterion is to ensure that information and relationships that are implied by visual or auditory formatting are preserved when the presentation format changes. […] The purpose of this success criterion is to ensure that when such relationships are perceivable to one set of users, those relationships can be made to be perceivable to all.
Not everybody has to see the text labels; there is nothing there that says they can’t be hidden off screen. What I believe it is saying is that the same meaning needs to be conveyed to all users, regardless of the presentation.
For a visual user, for example, the meaning may be conveyed through the visual layout, colours, icons, etc., while for an aural user the same meaning may be conveyed by speaking the text label.
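For readers following along, the offscreen-labelling technique being argued over is usually implemented along these lines (a generic sketch; the class name and coordinates are mine, not from either WCAG document):

```html
<style>
  /* Moves the label out of the viewport while keeping it in the
     document flow for assistive technology; display: none or
     visibility: hidden would hide it from screen readers too. */
  label.offscreen {
    position: absolute;
    left: -9999px;
  }
</style>
<form action="/search" method="get">
  <label for="q" class="offscreen">Search terms</label>
  <input type="text" id="q" name="q">
  <input type="submit" value="Search">
</form>
```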
bq. CSS layouts, particularly those with absolutely-positioned elements that are removed from the document flow, may simply be prohibited at the highest level. In fact, source order must match presentation order even at the lowest level.
Again, I seem to be interpreting the spec differently. Nothing in the spec says the presentation order must match the source order; it simply states that the same meaning must be conveyed to the user regardless of the presentation.
It’s ok to use absolute positioning (or any other layout method) to alter the presentational ordering, as long as the meaning of the content is not altered.
Regarding the WCAG Samurai, I’m a bit disappointed that the work will take place behind closed doors and urge you to reconsider the openness of the development process. However, I think it’s only fair that you and the other invited members be given a fair chance to show that your model can and does work; and I’ll await the results of your work before I reach any conclusions on the issue.
Joe is right, it shouldn’t come down to anyone’s style (hopefully I didn’t give the impression I meant that, I don’t).
As a guy in the trenches (like the rest of us) I’ll tell you what I need. I need a reference that I can turn to to help me build accessible code. It has to be succinct and to the point. All the important stuff needs to be floating on the top.
If I want to dig, fine – that can be there too. But the stuff I need to get the job done needs to be front and center and *implementable* in the current state of things.
Can the WCAG deliver that? Apparently not. Who can deliver that so people like me can use it to make the web a better place? And deliver it so it makes business sense to silly companies that label themselves “web 2.0” (not just companies who are receiving government funds and are thus required to comply – and who don’t anyway) – so I can actually implement the stuff in what is becoming a more and more complicated client side experience.
To me, that’s the 64-dollar question.
I think that one area in which the WCAG Techniques document _is_ crystal clear is banning any difference between source and presentation order:
“Thus, it is important not to rely on CSS for a visual-only layout which differs from the source code or programmatically-determined reading order…. [In t]he following example… the text appears visually in the browser in a different order than in the markup.”
The fact that we don’t know what it really says is an issue. I think it also tells us that existing standards-compliant sites with good graphic design may be nonconforming, i.e., banned. I wonder if WAI’s recent site redesign would pass this criterion; anyone want to check?
bq. »the text appears visually in the browser in a different order than in the markup«
… which means it isn’t limited to absolute positioning, but in fact includes (or rather excludes) Any Order Columns as described over at P.I.E.
As much as there is a supposed “climate of fear” in the WCAG WG, I’d contend that even in writing this comment I am fearful of Joey’s reply… if he’ll even stoop as low as me 😉
Look! Nobody set out to do bad work. For example, Microsoft did not set out to make our PCs hang and make programs bloated and ugly. So it is healthy that we have watchdogs.
WCAG WG is full of many dedicated people who are trying to do their best. Joey C here, in his own way, is also doing diligent work to make sure we have good standards, and he should be commended for that.
Remember, standards should be firm guidelines to be realistically followed, not nasty rules to be obeyed [in my real world opinion].
Making the world a better place is a “bit by bit” process and not a “one fell swoop” process.
In the usability world we tend to make *testing* and the *inclusion of actual end users* a huge priority at ALL phases of guideline (and site) development. Perhaps that message could be taken up by another group so as to augment all the approaches to Web Content Accessibility?
[thanks for reading my view on this admirable work]
So, should e.g. be in an abbr element with a language of latin to expand to “exempli gratia”, or should it be replaced with “for example”? Needless to say, I’m a bit curious what the WCAG2’s stance on this would be.
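For what it’s worth, the first of those two options would be marked up something like this (a sketch; whether WCAG 2 actually demands either form is exactly the open question):

```html
<!-- Keep the Latin abbreviation, but declare its language and
     supply the expansion for user agents that expose it. -->
<abbr lang="la" title="exempli gratia">e.g.</abbr>
```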
“In the discussion below, links to and within these documents were difficult to finesse, given their numerous, but still insufficient, fragment identifiers. In some cases—paging Steve Faulkner!—no sensible title attribute was apparent.”
Are you and your invisible friends going to explain to us earthlings how the title attribute content can be made keyboard accessible in current user agents?
Apart from our personal spat about title attributes, I appreciate and agree with a lot of what you have to say here (not that my opinion counts for much).
Now I haven’t read it, but I do trust Joe’s judgment, and that’s an entirely worrying article. At the end of the day, if we’re looking for improvement and widespread adoption of these standards into websites, shouldn’t we be making this much simpler and easier to achieve, not the other way round? Too many academics in ivory towers, methinks.
Oh, I think it’s worse than Joe writes. As I see it, the WCAG 2.0 accomplishes two things: It makes it less likely that sites will ever really be accessible to persons with disabilities. And it makes the price of admission for an ostensibly accessible site (i.e. one that “meets” the guidelines) quite high.
Who will have the resources to build such sites? Answer: large corporations. Individuals and small businesses need not apply. Now all we need are laws to mandate the use of the WCAG 2.0 and big corporate interests can finally finish the job of wresting control of the Internet from ordinary folks.
And how clever! The end of democracy on the Internet brought about in the name of protecting the weakest members of society.
Having once served on the WCAG 2 committee, I doubt that many on the committee see their work this way. But just as journalists internalize the values of their corporate bosses and self-censor, the members of the WCAG 2.0 committee, under steady pressure from the big corporations who *are* the W3C, gave us a document that works to advantage big corporations over the rest of us.
Does this surprise anyone?
Many years ago (1997, I think), I wanted to join the W3C but discovered to my surprise (given Tim Berners-Lee’s talk about democracy) that there was no provision for individual membership. Small businesses (under, I think, $1 million gross per year) could join — for $5000 per year. Since that was about my yearly gross at the time, membership was impossible. Note that much of what the W3C does is behind closed doors, despite their claims of openness, so there’s good reason to want to be a member.
I wrote to Tim Berners-Lee and another member of the W3C’s board of directors asking about individual membership. Berners-Lee couldn’t be bothered to reply, but surprisingly the other board member did. I don’t remember his name, but I remember clearly what he said. He said that the W3C had considered individual memberships, but the big corporations upon which they were dependent for their financing (his words) didn’t want to dilute their power in the organization, so they’d decided against letting individuals join.
When I sat, briefly, on the committee back in 2001, we thought we were a matter of months away from a new version of the WCAG. Now, five years later, an unrecognizable document has finally emerged. I wish I could say I’m surprised, but I’m not.
The problem is not with the WCAG 2.0 Working Group. Certainly, there is plenty of backstabbing, grandstanding, dirty politics, whining, and worse on the committee, as there is on every committee in which humans participate, but there are also many very bright, very dedicated, and very well-meaning individuals on it. The problem is that the committee exists within the context of an industry consortium, not a democratic entity, in which one man who depends entirely on the largesse of big corporations for his salary has the final say on everything.
The W3C will never produce anything that doesn’t benefit big business. If the needs of big business and those of the common people coincide, then the W3C produces useful product. If those interests conflict, however, the W3C always has and always will protect the interests of its members over those of the public. Sadly, few people seem to understand this as the truth is carefully hidden behind talk of consensus and transparency.
We, the people, built the Internet. We ought to own it, and it should be operated for our benefit, not for the further enrichment of private interests.
Fat chance of that, though.
Now, Steve, I did not discuss keyboard accessibility of title attributes, so do please stay on topic. And surprisingly few of my friends are invisible. Certainly Zeldman has high visibility. Or at least his book covers do, noticeable as they are from a two-block radius.
One thing I forgot to mention above:
It’s funny how sometimes people can have all the right data, make all the right connections, and still manage to draw the wrong conclusion. In his section entitled, “The Process Stinks” Joe points out that the WCAG 2.0 Working Group has been anything but open. In fact, it’s an elitist group of so-called experts and it is highly resistant to input from anyone else. And Joe is exactly right in pointing out that the WG manifestly /does not practice what it preaches/ — persons with disabilities need not apply (as well as those who cannot afford two hour long distance phone calls or flying all over the world for face to face meetings).
So what does Joe conclude from the lack of openness in the WCAG 2.0 WG? That “[A totally open process] is a viable model for standards development . . . but in web accessibility it is proven not to work.” Um, come again? How exactly does the complete lack of a totally open process prove that a totally open process doesn’t work?
Joe’s solution, not surprisingly given his authoritarian temperament, is to create yet another elitist, closed, (and this time even secret) group to solve the problem. Apparently, in Joe’s world the ends really do justify the means.
Unfortunately, in the real world the ends and the means are one and the same. You are what you do, not what you are trying to accomplish. Joe’s WCAG Samurai will accomplish exactly what Joe is trying to avoid: they will legitimate the use of elitist, closed groups to create web standards. Then it will be the WCAG Samurai vs. the W3C WCAG WG, and the loser will be…
The rest of us.
Come on, Joe. Here is your chance to expose the W3C for what it really is. But to do that you have to give up your demagoguery and embrace a truly open process. How about it just once?
Charles, give me a break. While decisionmaking rests with the groups that can actually _vote_ in WCAG Working Group — W3C Members (note the majuscule), staff, Invited Experts, and the curiously-named participants in good standing — the mailing lists, calls, and face-to-face meetings are open to anyone. Nominally.
So they tried that and it didn’t work. We’re going to try something else. If _that_ doesn’t work, I’ll expect you to be the first to say so. And besides, we’re just writing _errata_, not a whole new set o’ guidelines.
Where decision-making rests is more than a little important. And as for the “nominally” open process, for evidence of just how open a process the WG is, I refer readers to this writer who sums it up quite nicely:
And if it’s just “errata” that the WCAG Samurai will be writing, why the secrecy? Why the need to hide the process? Why not have the discussion right out in the open so that everyone knows who’s involved and what’s being discussed?
Also, it seems to me that the whole point of these “errata” is to kick the legs out from under the WCAG 2.0, so effectively they *are* determining the standard, if the Samurai are successful.
The logic of Joe’s argument escapes me. Wasn’t the problem with the Working Group that they *wouldn’t* listen? So how is this rectified by creating a new group that *won’t* listen? Or is the point that this group is controlled by *Joe*? Is that the secret to successful standards? Ask Joe?
What’s funny about all this (or what would be funny if it weren’t so anti-democratic) is that I think that Joe is probably right about the WCAG 2.0. But somehow a very perceptive critique segued to a very bad solution.
Open, fair, and transparent processes in standards-making are not optional elements to be discarded whenever inconvenient. That’s like saying “We had to destroy the village to save it.” It just doesn’t work that way.
Given that Joe hasn’t outlined the way that the proposed ‘Samurai’ group will work, I think ascribing secret and anti-democratic principles to his suggested solution is tilting at windmills.
One would hope that the Samurai, secretive though they may be, will still solicit public opinion; if they do not, you can be sure there will be a backlash – but let’s see how it pans out first, no?
While not all of us follow what the W3C(World Wide Web Consortium) is up to, I’d just like to give Mr. Clark thanks for bringing up the issue to the uninformed masses who keep tabs on ALA(A List Apart) more than they do on W3C(World Wide Web Consortium).
I’d always thought of the W3C(World Wide Web Consortium) as a bit bureaucratic, but this takes the cake…
Good luck with the errata–if nothing else, it may give the W3C(World Wide Web Consortium) a better idea of how many people are against the new specs.
“Empty summary attributes are acceptable on layout tables, but not recommended.”
So they *recommend* not to use a W3C recommendation? Yep. Confusing.
I last read through WCAG about 12 months ago, and I had hoped that they might have been distilled into more sense by now, but it seems not.
The scripting guidelines were my biggest concern before; at the time, the only word I could use to describe them was “nonsense”. Looking at the revised guidelines… well, they’re better, but (at a quick summary) they’re still dominated by tips and practical suggestions which:
1 – belie a lack of practical experience of scripting with accessibility in mind
2 – have not been properly tested aside from a narrow range of the very latest devices
3 – pass on the same bad advice and misconceptions as WCAG 1 (e.g. event-pairing, which is utterly and demonstrably wrong)
I shall compile and publish a more detailed breakdown when I have a bit more time.
We’re writing errata for WCAG 1.0, not 2.0 (as yet). Sorry, Charles: I understand your point and simply disagree with it. Let me say it one more time. WCAG tried one way and it didn’t work. We’re going to try another way in the hopes that _it_ will work. You don’t have to like it. Perhaps you will like the results, though.
I think it’s great that Joe is going to write some errata for the WCAG recommendations and make that available to the rest of us. If he wants to do that with a private list of people he wants to work with – great. If he wants to do it all by himself -great.
That doesn’t stop anyone else from getting together with their colleagues, writing about accessibility, and sharing it with others. If enough people agree with what you’ve written, it will gain traction. If it’s not very good (like Joe suggests of WCAG 2) people will keep looking for alternatives.
If you don’t like the WCAG process, and you don’t like Joe’s process… make your own.
I work in the Mass.Gov Office for the Commonwealth of Massachusetts. We have spent a lot of time and effort over the years educating and cajoling state agencies about web accessibility, which is required by federal and state laws. We relied heavily on WCAG 1 and 508 in drafting our standards, but are concerned that WCAG2 won’t be much help to us in the future.
My immediate concern, however, is that we won’t be able to finish absorbing all this content in time to file comments. You advise that we ‘petition for at least another month’s commenting time, quoting W3C process back to them (viz., comment periods “may last longer if the technical report is complex or has significant external dependencies”).’ How exactly do we do such a thing? We have attorneys in the Mass. Office on Disabilities who need to know, and I’m having no luck finding guidance on the W3C site.
I don’t have time to understand or decipher guidelines. They have to be dog simple and technically possible for me to implement them.
I look forward to what the Samurai Group comes up with, especially if it’s dog simple and technically possible to implement.
“Here. Do this tag.”
Send mail to the public-comments list, to the WAI cochairs, and to WAI head Judy Brewer. The addresses are on various Web sites (but start with email@example.com, which, in principle, should be sufficient).
Oh, good, so it wasn’t just me. I am utterly baffled by WCAG 2’s definitions and levels.
Regarding “Christophe Strobbe’s comment”:http://www.alistapart.com/comments/tohellwithwcag2?page=1#4 that “WCAG is not just for today’s standardistas”: if anything, WCAG 2 in its current form would discourage the adoption of accessible development practices beyond the circle of those who already do it. Even under WCAG 1, the goal should be accessibility rather than compliance with a single (somewhat flawed) standard. If you are determined to achieve the goal, it is still possible to get the information you need from a multitude of sources out there, but it requires more research. WCAG 2 has made that situation worse. Also, its obscurity puts people who are mandated to create accessible sites in an awkward position.
Mr. Clark: Regarding “your remark”:http://www.alistapart.com/articles/tohellwithwcag2#WCAG-documents:levels that “[WCAG 1] Priority 3 is irrelevant and unattainable”. Wouldn’t you say that “9.4—Create a logical tab order through links, form controls, and objects.” is still pretty good advice?
By the way, Kevin Cornell’s horse-backwards (as opposed to ass-backwards) illustration is a pretty disturbing image!
Unbelievable. This is utterly astounding to me. Who are these people and how did they commandeer WCAG 2?
Thank you, Joe, for pointing out this atrocity to those of us who have been (until now) unaware of it. I would assume most web designers/developers (even those of us who have dedicated years to web standards) have not been actively following the development of WCAG 2. So for this article, you have our gratitude.
Wow. I am completely stunned.
Great read, thanks for posting this!
At first I thought Mr. Clark was overreacting, but in browsing over the article, I began to share his feelings of disappointment and frustration.
I’d like to highlight/expand on a point made by Joe about the following:
It says that your code must produce the same DOM structure across user agents. However, assuming the same (X)HTML code, this should be the responsibility of the user agents as opposed to the Web Content Authors (I think WCAG refers to them as unambiguous web component author units?).
Take the following code snippet as example:
Most browsers assume a tbody element between the table and the tr elements, even when none appears in the markup.
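The snippet Taylor posted appears to have been eaten by the comment system; the standard illustration of the point (my reconstruction, not Taylor’s original markup) goes like this:

```html
<!-- Source markup with no tbody... -->
<table>
  <tr>
    <td>data</td>
  </tr>
</table>

<!-- ...which most HTML browsers expose in the DOM as if it were: -->
<table>
  <tbody>
    <tr>
      <td>data</td>
    </tr>
  </tbody>
</table>
```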
Isn’t this counter to accessibility? Not to mention standard good practices, maintainability, proper seperation of content and logic, etc…
So what’s the motivation for this? The only people who can benefit from this are sloppy implementors of core user agents, or technologies that sit on top of these (ie screen readers) and don’t want to have to deal with this, so they’re foisting it off on us.
This is a transparent attempt to put the onus of fixing broken or deviating (X)HTML to DOM implementations in various browsers on Web Content Authors and not where they belong: the makers of the user agents, and, to a lesser degree, the makers of the technology sitting on top of these user agents.
Most of what Mr. Clark has pointed out are pretty strong flaws in the standard, but this is to me the clearest example of the conflicted, biased, and unreasonable nature of the spec. Further, pitching this self-interested counter-standards schlock as supporting/furthering accessibility needs (which NEED supporting and furthering) gets my dander up, since the only people this seems to support are the authors.
I hope whatever governance is in place at W3C reins these guys in quickly, or some well intentioned but uninformed legislator could make this stuff law…
I am pretty sure the results of Taylor’s example will differ between HTML and XHTML documents, because, if I recall correctly, XHTML always assumes the presence of a thead and HTML doesn’t. Or vice-versa. Can somebody explain? (AvK, where are you?)
According to the “WCAG(Web Content Accessibility Guidelines) 2.0 Guidelines”:http://www.w3.org/TR/WCAG20/complete.html#conformance-wcag1,
bq. Authors whose content currently conforms to WCAG 1.0 may wish to capitalize on past accessibility efforts when making the transition to WCAG 2.0. A qualified conformance statement could allow them this flexibility. For example, a conformance claim might include the following statement: “Materials with creation or modification dates before 31 December 2006 conform to WCAG 1.0 Level AA. Materials with creation or modification dates after 31 December 2006 conform to WCAG 2.0 Level AA.”
So if an author wishes to capitalize on past accessibility efforts but *does not* wish to make the transition to WCAG 2.0, would the conformance claim “Materials conform to WCAG 1.0 Level Double-A” allow him the flexibility to do so, or does such a creation or modification disqualify the conformance statement?
Speaking as an average web developer…
The confusion that the WCAG (1 or 2) generates is so onerous that regular developers like me will NEVER be applying them, even if we are aware of the need. The majority of our clients don’t even have accessibility on their radar. AND if accessibility means generating three to four times the content in order to be “accessible” to all, they never will have it on their radar. Do you think ANY content provider other than government-based ones are going to develop alternate content for those at a lower reading level?…
What we need is “Accessibility for the real world” if you want designers to actually make their sites more accessible. CSS standards actually made our lives easier, so we went through the pain of learning them. If the W3C expects designers to use accessibility standards, there need to be clear guidelines, and they need to be cost-effective for the clients.
Thank you Joe. I was wondering what was happening in the world of that particular working group. In a previous job I was a web developer at my old university. It was exceedingly difficult to get the various IT groups, heads of schools and uni bureaucracy to become interested in accessibility – and this despite physical access around the university being considered a high priority.
I’ve been harping on about the value of web standards, semantics and accessibility for around five years to the younger developers I’ve been in contact with, and it has only been in the last two years or so (even one?) that people have started taking up the cause in a big way. While accessibility might not always be at the front of their minds, there has been success in getting the foundations of these – standards and semantics – in at the start of the development process. To see that this has been totally ignored is not good. Further, as with all standards, clear and concise instructions are a key requirement. I’m yet to read the working group’s work, but will with great interest.
Finally, one last comment. A designer is not a developer, but a developer can be a designer.
As someone who provides training to Web developers and designers about accessibility, I’ve followed the progression of WCAG 2.0 quite closely and with growing dismay. I’m relatively happy with the “four principles”:http://www.w3.org/TR/WCAG20/complete.html#overview-design-principles around which the guidelines are organized (though levels of perception, operability, understanding and robustness will differ widely), but the details will first confuse and then dishearten those responsible for their implementation.
I’ve always emphasized techniques that make sites accessible on a practical level, rather than just ticking WCAG 1.0 boxes and hoping for the best. But WCAG 1.0 was the starting point, not least because of its importance as a de facto standard adopted by (UK) government and the public sector.
Showing learners that there are straightforward steps they can take _today_ to improve the accessibility of their sites is rewarding. That these methods can be shown to meet the requirements of most of WCAG 1.0 is a happy outcome. Unless WCAG 2.0 can be straightened out, that outcome will disappear and we’ll find ourselves again in the situation where all but the most committed designers and developers will turn away from accessibility in droves.
For many clients the primary reason that accessibility is considered to be important is unfortunately nothing to do with providing access to sites.
There are a significant number of companies who want to avoid scenarios such as the “Sydney Olympics court case”:http://www.tomw.net.au/2001/bat2001.html. Here in the UK such a case would involve the DDA, and from what I understand the case would most likely be centred around whether the site “complies with the WCAG guidelines”:http://www.webcredible.co.uk/user-friendly-resources/web-accessibility/uk-website-legal-requirements.shtml. There seems to be an ever-increasing gap between “real” accessibility and sites that pass WCAG checkpoints (and therefore Bobby).
I commend Joe for writing such a fantastic article, and I think the Samurai will help enormously to make sites actually accessible, but WCAG 2 simply cannot be ignored.
A(nother?) view from Australia:
Gian (whom Joe references in this article) and Bruce Maguire (of IBM/SOCOG arse-kicking fame, now working for HREOC – the Human Rights and Equal Opportunities Commission) presented a few weeks ago in Sydney, roundly slating the guidelines pretty much as Joe has done here.
Gian outlined numerous examples of the corporates using politics, 5am Australian time meetings and so on, not to mention “cease and desist” legal letters and booting people off the group.
The motive suggested was that the corporates are looking to water down the guidelines as much as possible, giving them freedom to produce whatever code they like.
This makes sense for the corporates economically: if there are no clear cut, concise standards to develop to, it’s a lot harder for smaller enterprises/individuals to compete. See the browser wars and their proprietary code issues around 1999 for an example of this logic …
Another example is the new “baseline” criteria – a site only has to be as accessible as the technology allows. E.g. a pure Flash site 3-5 years ago would be deemed to meet the guidelines, as Flash didn’t have any accessibility features. Not much of an incentive for Macromedia etc. to develop accessible applications, huh?
The semi-good news, from the sounds of it, is that there’s no guarantee that HREOC (and hopefully legal bodies in other countries) will accept WCAG 2.0 as guidelines for legal compliance. Because the language is vague and does not refer to specific technologies, it’d be pretty hard to argue one way or the other in court.
Taylor Mathewson wrote:
bq. Most browsers imply a tbody element between the table element and its rows.
In HTML 4.01, the tbody element is always present. However, both its start- and end-tags are optional. Any browser that does not imply the tbody element for the sample code you provided is broken; there is no ambiguity in that code. (In reality, this applies to any (X)HTML document served as text/html.)
However, strictly speaking, for XHTML (due to XML parsing rules and the desire for compatibility with common authoring practices) the tbody element was made optional. For an XHTML document served as XML, the tbody element will not be implied—again, there is no ambiguity.
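To illustrate: given a table written without an explicit tbody, a text/html parser builds the DOM as though the author had typed the element in full (cell contents are invented):

```html
<table>
  <tbody>
    <tr>
      <td>First cell</td>
      <td>Second cell</td>
    </tr>
  </tbody>
</table>
```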
Sorry, didn’t read all the comments so this may have been mentioned already…
PAS78, ‘Guide to good practice in commissioning accessible websites’, published by the BSI in March this year, is exactly what it says on the tin.
Commissioned by the Disability Rights Commission from the British Standards Institute and primarily authored by the likes of Julie Howell (RNIB) and representatives from organisations like AbilityNet, the BBC, IBM, the Cabinet Office and others.
Aimed at the people who pay for websites rather than the people that build them, it draws heavily from WCAG 1, and no mention at all is made of WCAG 2.
If (as is hoped) this document is successful and organisations in the UK pick up on it and use it as the basis for their future stance on accessibility, then WCAG 2 may not be needed.
PAS78’s language is clear and concise and, combined with WCAG 1, covers any accessibility requirement that any website may have.
Yes, WCAG 1 has its flaws and needs updating; let’s wait and see what the Samurai offer 😉
PAS78 is available from http://www.bsi-global.com/ICT/PAS78/index.xalter
I hadn’t got around to perusing WCAG 2.0 yet, but it’s already being touted where I work (a major university) as a panacea for all web accessibility ills by the department responsible for making sure we comply with such things.
I’m both happy (and obviously not so happy) to see that it falls very much short of its lofty goals, and in some cases is actually making the situation worse, especially for those of us who code to standards as a matter of course.
bq. Thus, it is important not to rely on CSS for a visual-only layout which differs from the source code or programmatically-determined reading order…. [In t]he following example… the text appears visually in the browser in a different order than in the markup.
I think this *could have been* a good point. Specifying “CSS” suggests that the people who wrote it don’t really understand what they are talking about. Yes, it is possible to have things appear all over the place with CSS – but a much bigger problem is people who use *tables* to change the apparent order of things. This can have a much more serious impact on the accessibility of a site than someone who uses CSS to put the end-of-document navigation bar down the left of the screen.
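For what it’s worth, the kind of CSS reordering the guideline seems to be worried about is easy to sketch (the IDs here are invented for illustration): content that comes last in the source but is displayed first.

```html
<style>
  /* #nav is last in the source order but is positioned at the
     top left, so it appears first visually. */
  #nav  { position: absolute; top: 0; left: 0; width: 10em; }
  #main { margin-left: 11em; }
</style>
<div id="main">Main content, which comes first in the source.</div>
<ul id="nav"><li><a href="/">Home</a></li></ul>
```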
So, I’m glad about WCAG2. I’ve watched it develop, been happy with the discussions and obviously disappointed with the document(s) but not with the outcome. What I love about the web (and my job) are the few individuals who are proactive enough to say “so WCAG2 sucks, we’ll just take a step back and get it right ourselves”.
WCAG2 should not be forgotten but embraced as part of the web evolution that went wrong – but that led to a better standard. And it will. Think of WCAG2 as the DeLorean of web accessibility. So many promises unfulfilled.
To set the scene, I am your average garden-variety web developer. I am a simple soul, with college education, good English skills and above all, good HTML skills. I spend all day, every day, producing sites – for everything from the local dentist to the multi-million pound nation-wide high street chains.
WCAG 2 has disappointed me.
For a start, I just don’t understand it. I am not stupid, but I just don’t understand how it applies to what I build. How can I bear all that in mind when going through the stages of planning/building/testing a site?
It’s going to take months to incorporate that into my daily routine and, to be honest, I cannot see the commercial benefit. Most of the sites I build, I make WCAG 1 Level 2 accessible out of good practice and for good karma. I like it; I enjoy the sense of responsibility I get from it. It sets me apart from the monkeys knocking sites out in the back bedroom.
WCAG 2 is so difficult, why would I bother? My customers do not care! If they do, then they will have to pay me a lot to have a compliant site, as the extra amount of time involved does not come for free.
If WCAG 2 were actually simple – a plain-English, simple-to-understand checklist (i.e. do you use PDF? see page 3; if not, continue to page 4 > checklist) along with a highly automated checking system – then we would see a lot more developers producing compliant sites. Just dream… an internet with more and more compliant websites. Is that not what we all want?
Sorry to be off topic, but I have to reply to the comment from “post 43”:http://www.alistapart.com/comments/tohellwithwcag2?page=5#45
bq. Finally, one last comment. A designer is not a developer, but a developer can be a designer.
Hogwash. Calling oneself a designer does not exclude the ability of the person to be a developer and vice versa. Design and Development are different disciplines. Any particular person may have an aptitude for one, the other, both or neither. I’ve met plenty of people in the first, second and fourth category, but very few from the third.
Joe, you have done the web a great service, and one, as you pointed out, I have been too scared to do due to the internal scare tactics of the Working Group (I think I just ensured that I never attend another 6am teleconference again – @Stalvies: the Japanese WCAG WG teleconferences were at 5am, the Australian teleconferences at 6am).
During my six years on the Working Group I have met some wonderful people fully committed to web accessibility even though it is yet another thing they must fit in with their busy schedules. I have also experienced bigotry, ignorance, authoritarianism and stand-over tactics. This behaviour would spur me on – I like a challenge – but after a while it became too exhausting (I think the final straw was a thirteen-hour teleconference).
If you want to hear my thoughts further on the subject you can listen to my podcast for the Web Standards Group in Sydney last month:
I am also more than happy to talk / interpret / argue about WCAG 2.0 – I have some insight into the Working Group reasoning behind some of these success criteria (read: checkpoints).
1. What happens if WCAG 2 becomes a legal requirement, like 1? How will anyone follow it properly?
2. I have always disliked the W3C documents. They are verbose, circular, and often fail to give examples. Compare them to the great way web languages are explained in simple steps and clear English on the W3Schools site.
Sadly I see the same mistakes being made by the WHATWG (HTML5), with endlessly long documents, which take forever to load, even on broadband, when you only want to reference a tiny part of them.
We need a new approach. Clarity and a simple structure to navigate should be the key goals. I would even go so far as to say we need a *whole new standards body*.
3. The W3C also seem painfully slow to progress. We should be on to CSS4 or 5 at least by now. Browser makers are forever holding back on things like CSS3, in case the draft changes. How is the web supposed to advance quickly?
4. Lastly, Charles Munat wrote in comment 22: _”We, the people, built the Internet.”_
Sadly this is not true. The US Military built the internet, followed by the universities. And if it weren’t for big business, the infrastructure for today’s net would never have been afforded. Think telco, fibres, satellites – everything to do with the net has probably cost billions. In this, the user is a very small pebble on the beach indeed. However, I suspect that what you meant was that users built the “content” of the web.
One part of the accessibility standards appears to require that any content intended for educated adults, or even for high school students, have an alternate version written in words of one syllable. Yet I, an educated adult and full-time website developer, am completely unable to comprehend large chunks _of the standard itself_.
Where is the version written to a level suitable for people who are not W3C members? People such as the ones in this discussion who are debating multiple possible, and apparently equally valid, interpretations of some of the requirements we are expected to follow? People such as me?
It’s even more ironic that the Web originated as a hypertext system for CERN. I would love to see some of those early documents made “accessible” according to this requirement of WCAG 2. How exactly does one go about writing a paper on nuclear physics that is understandable by someone with a “lower secondary” reading level (lower secondary in what country? in what state? in what language?) but is still useful to its target audience, namely physicists?
I’m all in favor of accessibility. I’ve been pushing it on my clients for years, with varying degrees of success. But dumbing down the entire Web to a child’s reading level in the name of a broken concept of “accessibility” is flat-out asinine. It’s taking the “lowest common denominator” idea to its most ludicrous and outrageous extreme.
I’m nobody. I’m one of those people in a back room banging out websites for small businesses and helping them compete with the big guys. You’ll never see me presenting a paper at a conference or writing high-flown tomes on the One True Way to do this or that or the other thing. I’m one of the people down in the trenches, making the Web work while the great talking-shop chatters away. For every website built by some global corporation there are ten or a hundred or a thousand built by someone like me, and even more built by the owner’s nephew who’s got a copy of FrontPage and thinks he’s a website designer now. If the standard isn’t attainable (or even comprehensible) by people like us, it’s not going to be used at all down here. So everyone loses — especially, the very people this travesty of a standards document is purporting to help.
Am I allowed to say “purporting”? Would a middle-school child understand that? Do I have to provide an accessible alternative to this comment? Would “WCAG 2 r teh suxorz!!!” do?
Could someone please explain to me how a specification that is supposed to enhance the usability of a Web site does not require the code to be valid, yet requires the code to be presented in the user agent in the order it appears in the source code, while not requiring transcripts and audio versions of video to be made available to the blind and deaf (and/or other vision and/or hearing related difficulties)?
Or what about the flicker that may now be permitted with ANY part of the page? I’m sure people with epilepsy will really appreciate THAT “accessibility feature” (they won’t).
There’s a lot more I want to say about this, but it delves into the realm of flames, trolls, wood nymphs and cat girls, so I’ll stop here for the sake of the other readers.
Lachlan Hunt wrote:
bq. In HTML 4.01, the tbody element is always present. However, both it’s start- and end-tags are optional. Any browser that does not imply the tbody element for the sample code you provided is broken, there is no ambiguity in that code.
Yes, I agree. I think I obfuscated my point a bit, so let me reiterate. Based upon WCAG 2.0, the “broken browser” scenario you mentioned above is something we now have to deal with as web content authors.
F28 states in part:
bq. 1. Using at least two different user agents examine the DOM generated from the markup. The user agents must have the same capabilities available. For example, if the markup relies on scripting, make certain that both user agents have scripting enabled.
2. Determine if the DOMs are the same.
If step #2 is false, this failure condition applies and content fails the success criterion.
Unless I’m misinterpreting something, which is possible given the ambiguity of the document, this can be paraphrased as:
bq. We don’t care what the host language spec says or what most browsers do. If one of them interprets a piece of (X)HTML code differently from the spec/all the others, or has a non-compliant DOM interpretation, *we still expect matching DOMs to be produced*.
In other words, consider the following scenario:
We have a valid (X)HTML page. We put it through browsers A, B, and C. These browsers are all compliant [enough] that they produce a DOM structure X according to spec.
Along comes browser D. Now browser D’s developers are nasty little buggers, and the browser is significantly non-compliant. It digests our page and produces DOM structure Y.
Our page has just failed WCAG 2.0.
Now I’m about to speculate on motives, so everybody get their tinfoil hats on.
The only reason I can see in doing this is so that screen readers or similar products that sit on top of a browser no longer have to deal with deviating implementations.
WCAG 2.0 seems to foist the responsibility for compliant DOM generation on us when the responsibility should lie with browser makers.
Well, in response to Joe’s earlier post, the WAI’s website (http://www.w3.org/WAI/) does not follow WCAG 2.0 guidelines. Namely, the source code doesn’t match up with what is on the page. One of the first elements is a skip anchor which is nowhere to be seen. The first element I see on the page is the W3C logo in the top left.
Now, of course, if you bring up this point with any of the goons at either the WAI or the WCAG WG you’ll get the ever-so-simple scapegoat: “Oh, yeah, but the guidelines are just a draft as of yet. When they come into effect we will implement them.” Well that’s all fine and dandy but if the guidelines are so much better you’d think that they would go out of their way to at least make the FRONT PAGE of their site validate as a sort of showcase.
The fact remains, the only logical answer to them not making their main page comply is that it’s too hard and they don’t want to re-do it if the guidelines change. Understandable. Or is it? Should it be hard to move from full-on WCAG 1.0 to WCAG 2.0? Realistically, no; yet apparently it is. Any changes in the guideline now are going to be minimal, so the not-wanting-to-re-do-the-site excuse is also bogus. So why not have their main page comply? Either they are all idiots, which is clearly false, or their head doesn’t know what their … is doing. And is that really the kind of people we want shelling out guidelines?
Taylor (comment 58), a broken UA would be covered under the clause: “The user agents must have the same capabilities available.” A browser that doesn’t support HTML sufficiently doesn’t have the same capabilities as the others and probably wouldn’t meet the baseline requirements either (although no browser fully supports HTML anyway). Besides, only 2 interoperable implementations are technically required, so it wouldn’t matter that some obscure and broken browser like that failed the test.
However, comparing the DOMs in different browsers is an unrealistic expectation from authors and that test should not be included as is. Ideally, validating the document should be sufficient to meet the criteria, but authors would also have to ensure that they haven’t used technically valid, yet widely unsupported features (e.g. The SGML SHORTTAG NET features).
I haven’t read through the entire document, but reading the references you’ve provided, Joe, I can feel comfortable with what those areas say:
* *Defining pages and sites* – This is amusing, but not exactly a danger to the web community or to those using assistive technologies.
* *Checking the DOM output and proving they are identical* – “The test”:http://www.w3.org/TR/WCAG20-TECHS/#F28-procedure says that the testing browsers must have the same capabilities. If one understands only HTML while the other understands XHTML/XML as well, they don’t have the same capabilities, so running the test using those browsers is invalid. But the concept of saying “you’re good as long as you get the same output from browsers with the same features” has merit in a world where not every HTML writer is perfect.
* *You can still use tables for layout* – This _is_ disappointing. I’m with you on this one.
* *No flashing* – Makes sense to me – I have no special desire to cause seizures.
* *Defining technologies as a baseline without recourse* – Actually, “the very next section”:http://www.w3.org/TR/WCAG20/complete.html#baseline-setting says that baselines can be set by jurisdictions, such as states, counties, or countries. I believe bringing charges under a country’s disabilities laws qualifies as recourse.
* *Defining entire sections off-limits to accessibility* – Actually, all it says is that you can state that only certain sections are compliant. In other words, all of the other sections are non-compliant. If non-compliance is illegal in a jurisdiction, then scoping your site’s compliance doesn’t mean that parts of your site are legal, it means that all of the other parts are illegal.
* *Publishing a WCAG 2 compliance declaration* – I can think of several areas where accessibility declarations are more stringent. Ask any architect or general contractor who has worked on a commercial building.
* *Not requiring audio descriptions for the blind* – It appears that they have since corrected that omission, which I agree is pretty important. (ALA is definitely a good place to go to be heard!) In terms of captions for live casts, it sounds like reality intruded.
* *Remixing podcasts* – If the podcast creators are aiming for Level 3 compliance, yes. Or if governments require that everyone must meet Level 3 requirements. I guess I’m not too worried, since governments tend to try to balance the betterment of business with the betterment of humanity. As for Flickr users, Flickr will disappear the moment accessibility requirements for content appear. “Please enter WCAG-compliant alternate text for this photo…” I don’t think we have to worry much about this one.
* *Requiring skip links for tiny amounts of navigation* – There are two alternatives described in “section 2.4.1”:http://www.w3.org/TR/UNDERSTANDING-WCAG20/Overview.html#navigation-mechanisms-skip-techniques-head — using skip links, as you’ve described, or grouping your content in a way that lets you choose which content to focus upon. That doesn’t sound too crazy, does it?
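For reference, the skip-link flavour of this is only a couple of lines of markup (the class and id names here are invented):

```html
<a class="skip" href="#content">Skip to main content</a>
<!-- … site navigation … -->
<div id="content">
  <!-- main content starts here -->
</div>
```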
* *Not using offscreen techniques because everyone must see them* – As “Lachlan Hunt”:http://www.alistapart.com/comments/tohellwithwcag2?page=2#14 pointed out earlier, the specs say nothing about requiring exactly identical words, just identical meanings.
* *CSS layouts may be prohibited* – “This section”:http://www.w3.org/TR/WCAG20-TECHS/#N100C7 opens with the following unambiguous statement (emphasis added):
“This describes the failure condition that results when CSS, rather than structural markup, is used to modify the visual layout of the content, _and the modified layout changes the meaning of the content_.”
All of the failure examples, descriptions of purpose, and tests can be read to follow this intent. Only the sentences you cherry-picked make it sound like positioning is forbidden.
If I created a layout that changed the meaning of my content, then it _shouldn’t_ be considered compliant. This says nothing that would prohibit a source-ordered, float-based layout, or any other designed layout that retains the meaning of the content.
* *Defining words, idioms and jargon* You don’t mean * gasp * using the likes of the dfn and abbr tags and title attributes, do you? Or even providing a glossary? As for providing pronunciation for words that are ambiguous if you don’t provide such, this does sound important for the example provided (multiple pronunciations of the same Han character). I would be disappointed, however, to discover that Japanese assistive technologies aren’t able to read characters correctly on computer monitors.
* *Maintaining multiple versions, compliant and non-compliant* How is that any different from maintaining versions of a document in multiple languages?
Several of the issues you bring up are worrisome. But most are (from what I can see) distorted interpretations of those areas of the document. I’m not a lawyer, but I think a good one _might_ be able to make a case that perverts the document in the way you’ve made it sound like it could be perverted.
But I probably wouldn’t _need_ a good lawyer to refute it.
A lot of people are now worried because of your interpretation of the document. And I agree that there are some issues that could be addressed (such as using tables). If you want to improve accessibility and gather support for your point of view, more power to you.
But screaming “Fire!” in a theater because you don’t like the ending of the movie is not a good answer.
You may be right on all counts, Michael, though you are significantly discounting the difficulty involved in compliance in some of your points. Or you may be wrong on all or some counts. Or I might be.
Now, even if we accept that a developer like you might support the intentions of WCAG 2, are you willing to concede that the documents are so unclear and so poorly written that my interpretations are not mere “perversions” but actually viable readings? _That_ is a problem.
Let’s imagine a scenario in which the WCAG Working Group absolutely sticks to its guns (quite imaginable given the levels of pique and stubbornness at work there, at least in one cochair) and changes _none_ of its requirements whatsoever. If they at least make the requirements much harder to misinterpret or actually unambiguous, do you not agree that such would be an improvement?
I think it would be — in one way. Your claim, in essence, is that I’m blowing things out of proportion. If WCAG WG made its intentions crystal clear, we could rid ourselves of what you consider a confounding factor, my exaggerations. But then I expect there would be almost the same alarm as I have allegedly caused by my exaggerations.
In other words, if WCAG 2 were crystal clear, wouldn’t people be up in arms just as much? I think more so, actually, because then they’d realize that WCAG _really meant what it says_.
By the way:
* duelling baselines are a non-starter
* deaf people are still deaf even if they’re trying to watch live video
* multiple DOM outputs are not reliable and WAI cannot prove from testing that they are
* defining content as inaccessible is a non-starter
* maintaining two documents of any kind really is twice the work and is antithetical to accessibility of electronic documents
* you misstated the business about offscreen positioning
* and I didn’t “cherry-pick” anything
Just as a side note, if WCAG 2 was crystal clear at least we would all be arguing over what is written and not what is potentially written. In my view that would be a lot more productive than bickering about what this or that sentence may say and what that potentially entails.
In short, compliance is a great thing, so long as you have an idea what you are complying with and there is a verifiable way to know you are indeed complying. As it stands, the documents could be read so many different ways that the only way we will know what the document actually says, short of the WCAG WG re-writing the entire thing, is when they decide to launch some sort of validator program.
I have no inherent problem with any of the specifications, although I do think it is dumb that tables will be allowed and CSS will be castrated. However, who am I to say that that isn’t the right course of action for their particular needs. That being said, I think the way they went about this whole fiasco is what made the documents worthless. When you’re fielding a document for millions of people, make sure they will all read it the same way before releasing it. If not, you’re just asking for trouble.
There’s a reason why lawyers study for so long. It’s hard to write anything that is unambiguous. Specifications should be, and that this one is not clearly shows that the WG didn’t care enough or didn’t put in the effort or just wasn’t skilled enough. Or maybe the big corporations wanted it to fail. Who knows. I, for one, know I won’t be implementing something I can’t even understand clearly.
Just a small request for clarification:
11. You can’t use offscreen positioning to add labels (e.g., to forms) that only some people, like users of assistive technology, can perceive. Everybody has to see them.
But aren’t labels for forms useful not only for users with screen readers but also for people with physical disabilities who need a larger area to activate inputs (like checkboxes)?
Joe and René, I agree with both of you. The document is very difficult to read — rather like a legal document — and therefore hard to comply with in its entirety. Because of that, it is very likely to be ignored by many people and companies unless they are forced to comply by law.
I am responding to the tone of the original points of the article. One of my coworkers read this article and immediately said, “Oh, no, writing code to Web Standards is a doomed effort!” The number of people who have posted, concerned about the initial article’s statements on CSS positioning and on the (X)HTML standards being a thing of the past, indicates that a degree of fear has been raised that, looking at the actual WCAG 2 document, didn’t need to be raised.
I would concede that your interpretations were viable, Joe, but _only_ if I were skimming the document like a handbook, rather than reading it like a legal document. When reading it like a legal document, it is unambiguous.
Several of the points you make ignore the fact that all of the Techniques sections start with the following sentence (emphasis added):
bq. *Each numbered item* in this section represents a technique or combinations of techniques that the WCAG Working Group deems to be *sufficient to meet success criterion xxx* as long as the technologies used are in the baseline you are using.
In other words, _you don’t have to choose the most onerous technique_. Any one technique will do. The exception is when dealing with navigation, where “at least two techniques”:http://www.w3.org/TR/UNDERSTANDING-WCAG20/Overview.html#navigation-mechanisms-mult-loc-techniques-head should be used (such as having page-to-page navigation as well as a site map or table of contents).
For a technical or scientific document, you don’t have to rewrite the entire document for secondary-level readers. You can “provide a text summary”:http://www.w3.org/TR/UNDERSTANDING-WCAG20/Overview.html#meaning-supplements-techniques-head and still meet the success criteria for this section.
Most of the W3C documents I’ve read are written in a way that is meant to be precise, to have defined do’s and don’t’s. They are written with the same precision — and incomprehensibility — as laws.
None of the W3C documents are easy to read. But communities develop that help each other to understand it. People write up ways to interpret the document. Those who have read the actual “CSS 2.1”:http://www.w3.org/TR/CSS21 specification can attest to the fact that it is an exhausting document to read and put into practice. Yet communities such as “css-discuss.org”:http://www.css-discuss.org have grown to help people understand it, and books have been written explaining how CSS works, to the point that it is much more accessible to the web community than it was when it was first introduced.
Those who are willing to parse these hefty documents should be able to feel confident where they are or aren’t compliant according to it. And those who have done so, and who have the inclination and writing abilities to help others, are more than free to create communities to help. I’d hope that they are encouraged to do so.
One of the major challenges with handling accessibility is the complexity and effort required to be true to its goal. I fully accept that it takes quite a bit of effort to have a fully accessible web site. But it is nothing compared to the requirements to be fulfilled to make a commercial building accessible. There are simply so many ways that a piece of content or a building might be inaccessible.
The challenge is trying to find a compromise between making a site (web or physical) less onerous to access, and making a site less onerous to create. It’s rarely one where both sides are truly happy with the result, until the culture changes so that accessibility is not onerous, but simply an opportunity to be more creative.
In regard to providing multimedia alternatives for the deaf, I fully agree that captioning live video is the right thing to do. It’s difficult, which is why it makes sense to me that “captioning multimedia”:http://www.w3.org/TR/WCAG20/complete.html#N1053B appears as a Level 2 criterion. There will always be more suitable ways to provide content in different formats. Those who can afford to do it the “right” way and use captioning should be encouraged to do so. But those who cannot still have an alternative that meets the need of conveying that information.
Regarding the other “by the way”s, readers may compare your statements in your article, and the rebuttals in my earlier comment, and make their own conclusions.
Indeed, one need not choose the most onerous compliance method. Nonetheless, they are all listed and available. Nothing would stop a pedant from complaining to a site author that he or she should have chosen another option, including the most onerous one.
I don’t precisely know how I read the documents (as a lawyer? what?) apart from reading the three of them front to back (save for some repetitive appendices) four full times. Every scenario I listed as a means of compliance is actually in there to the best of my reasonably advanced ability to interpret WCAG specs. Anyway, comparison with the CSS spec (another working group that can’t get its act together) is misplaced, as CSS is a user-agent specification and not something that working Web developers have to comply with. They may _use_ CSS, but they do not have to _comply_ with the CSS spec.
I am perfectly OK with your disagreeing with the emphasis of my article. That seems to be the extent of the disagreement thus far. Please tell us what you think WCAG Working Group’s course of action should be.
It’s a challenge. I see that there is an ardent need to give web developers and those using assistive technologies a sense of what a level playing field should contain; how we can help those who use the web differently from others due to circumstances beyond their control, while at the same time not stifling the web’s ability to foster creativity and commerce.
It seems to me that a large concern comes from a fear that these documents will become directly referenced in the laws of a jurisdiction or country. This would mean that anyone not complying with these _advisory_ documents could be subject to unforeseen penalties. Such a fear is well-founded, if we look at Section 508 in the U.S.A. and the laws of countries such as Australia.
I agree that a pedant could indeed complain that a site is not complying due to its non-adherence to the most stringent alternatives. Unless a jurisdiction is willing to shoot itself in the economic foot by requiring the most stringent alternatives under law, however, the likelihood of such complaints bearing fruit is fairly negligible, as long as the site adheres to at least one of the alternatives provided by WCAG 2 for each guideline. Although I don’t know much about international law, looking at the reactions in the most litigious U.S., I’d think this country isn’t alone in having penalties for frivolous lawsuits.
This doesn’t mean that everyone should simply say, “Well, if I can get away with the minimum to fulfill Level 1 of the WCAG 2 guidelines, then I don’t have to do any more than that.” If someone feels that they have the resources to go to Level 2, or even Level 3 for some areas, they can feel proud of their ability to make the web even more accessible to the degree of their resources.
So many guidelines made by the W3C are based upon the desire to put forward a best-faith effort. The HTML and CSS standards had little if any impact before WaSP and others asked the browser makers to come to the table and figure out how to work together for the betterment of their users. It then took tens, hundreds, then thousands of web developers getting together and figuring out how to put them into practice, so that we now have a more richly semantic and meaningful web than ever before.
The challenge is educating people who haven’t put in the effort to interpret the documents and their ramifications. Unaware of the pitfalls discovered by those early adopters, they then try to enforce blindly, without researching what it means. When these people are legislators in the government, they can cause unintended damage with their good faith.
With an almost-complete expanded document, the WCAG is now in a position to provide that interpretation. WCAG can take this time to work with those legislative bodies and members who have used the WAI and WCAG documents in the past, and to identify who might be interested in the future.
If we can help them understand the intent of the documents, and if the legislators want to use WCAG 2 in their laws, WCAG can help them understand the potential ramifications of applying too much carrot or too much stick. Perhaps an accessibility-sensitive legislature would _require_ Level 1 compliance, but _provide incentives_ for Levels 2 and 3 compliance, or make compliance dependent upon whether the site is for a commercial venture or Grandma’s photo album.
I don’t know the balance myself, but it seems that there would be no better body to provide that type of guidance than the authors, who are members of the world’s primary WWW standards board.
At the same time, we see that these documents are, by and large, difficult and exhaustive to try to understand. More often than not, incomprehensibility leads to confusion and fear. To combat this, we need to find avenues that help to explain what the document does, how it can be used, and things to consider about its use.
I earlier referred to css-discuss as the place where fledgling and experienced web developers come to ask about and understand the most basic and esoteric elements of CSS-based design. A similar forum could help those who are looking to figure out how they can meet the needs of accessibility.
We need to start exploring how the WCAG 2 guidelines can be used in real-world situations that balance the tensions of requirements and creativity. We need to be able to learn from these, and also learn how they might be abused. This can both help us for WCAG 2.1 (as I’m sure such a beast will be created 🙂) and give us the bones around which books can be written to help make the guidelines comprehensible. Again, being the authors of the document, I believe the WCAG can help start this initiative, evangelize it, and contribute to it.
Of course, this assumes that the members of the WCAG have the time and resources — and the willingness — to see that their message is delivered faithfully to the benefit of all, and that WCAG 2 will in fact become a document supported by material that everyone can understand.
Because I think we can both agree that without that interaction, without guidance and help about what it means to use it and enforce it, this document could be used to do great harm. I’d hate to think that the WCAG would want their document to be perverted in that way.
Apologies on the prior post — I just realized that in most of the places I mention “WCAG” (vs “WCAG 2”) I mean the working group, not the guidelines…. Kind of ironic for someone touting the unambiguity of the document…. 😉
Very thorough job, Joe; thanks for that. I would compliment it as the best such analysis available, but it is the only one I have come across! Does anyone know of any other similar attempt to digest this important document?
That said, I was discouraged by how very pessimistic it is, and find myself agreeing with Michael Landis that things are not as bad as you make them out to be. The intemperate tone and minor vulgarity needlessly inhibits the circulation of the work. Granted, most people don’t labor in as puritanical an environment as I do.
There are “some experts who disagree(Matt May)”:http://www.bestkungfu.com/archive/date/2004/08/web-accessibility-litigation/ but I believe the impact of WCAG 1.0 is almost entirely due to its “legal adoption.(Policies Relating to Web Accessibility)”:http://www.w3.org/WAI/Policy/
One of my more cynical, but reasonably well informed, opinions is that the influence of WCAG 2.0 on statute will not be strongly correlated with how well it is written. I am therefore motivated to have some faith that it will continue to improve before being released. The “opportunity for WCAG 2.0 to influence U.S. law(Refresh of Section 508 Standards Tops Board Rulemaking Plan)”:http://www.access-board.gov/news/508update.htm is high, and fortunately, “there is more time.(Extending Deadline on WCAG 2.0 Last Call Review)”:http://lists.w3.org/Archives/Public/w3c-wai-ig/2006AprJun/0083.html I strongly encourage ALA readers to share their comments with the WAI.
If I wore T-shirts out in public, I would have one emblazoned with the slogan “The intemperate tone and minor vulgarity needlessly inhibits the circulation of the work.” Typeface? Cooper Black.
(“To Hell with x” is an homage to other A List Apart articles, should anyone be unaware. And for “intemperate,” read “unsparing.”)
LOL! I’m not enough of an ALA regular to have picked that up, but neither are the folks I would like to refer to the article. I agree that “unsparing” is an accurate adjective.
_I fully accept that it takes quite a bit of effort to have a fully accessible web site. But it is nothing compared to the requirements to be fulfilled to make a commercial building accessible._
First, the requirements to make a commercial building accessible are clear and well-understood. I cannot agree that the same is true of the WCAG 2 documents. But more important is the fact that building owners can put enormously more money towards making those buildings accessible than website owners, especially small ones, are willing or able to put towards website accessibility. Down here in the trenches, we’re still trying to get them to not use graphicized text all over the place, and maybe understand what ALT and TITLE attributes are.
At the level I work at, a $2000 job is big; not a lot of my clients can afford more than that, and many can barely afford a basic website (and I work cheap). They’re small businesses struggling to make a living in a world of big-box stores and Wal-Mart supercenters. The money they pay me comes out of their money to buy inventory to sell, their money to pay employees to keep the store open (if they have employees), and so on — not out of billion-dollar corporate incomes.
From what I have read so far, full WCAG 2 compliance would roughly triple what I would have to charge for a website. Not only would that end up with me out of business, but it would mean that many of my clients wouldn’t be able to afford websites. Not partially accessible websites, not any websites at all.
One of my clients is an older gentleman who, for medical reasons, is unable to work. He survives on disability — and the income his online business brings him. If he were required to comply with those standards, he would have no choice but to close down his site and his business. He _is_ one of the handicapped people this “accessibility” is supposed to help, but instead, if mandated, it will turn him from an independent businessman into a welfare recipient.
This is a feel-good standard for “advocates”; it’s a guaranteed income for makers of ultra-expensive, high-end software; it’s an easy way to bulldoze competition for the 900-pound gorillas of online business.
For the small business owner, for the freelance website designer, for the other 99% of the people who have built the Web, it’s a business death sentence.
As a novice web writer (whatever that means), I thought the only rules were good language and simple layout. I started to read the rules but couldn’t understand them; then again, I don’t have a degree!
Surely the biggest problem here is that this document and working group are trying to cater to a number of audiences, including content creators, designers, coders, and managers. When you set off trying to address such a diverse group of individuals, you inevitably end up losing focus. It’s the “Jack of all trades” syndrome. The other big problem, of course, is that the working group is obviously aware that the previous WCAG has been mentioned in legal circles and linked with statutes such as the DDA in the UK, as mentioned in comment 47. This has led to certain parts of this document taking on an almost legal-style ambiguity in an effort to provide a “catch-all” solution for people looking to implement accessibility, both for current platforms and mythical future technology.
Of course, one of the first steps to gaining WCAG 2 approval is to provide a readable version of all content so that it can be understood by those with a “lower secondary education level”. Maybe we should wait for that version of WCAG 2, in the hope that it’s unambiguous, clearly written, consistent, understandable, and concise. Although I fear the WCAG 2–compliant version of the WCAG 2 guidelines will have to be wordy in the extreme if it is to cut out the legalese and babble.
I agree with what has been said so far — and I must say, Cooper Black would make an excellent statement unto itself. 😉
I especially agree that WCAG 2 is so precise as to be unreadable, which is why it’s vital we work to understand it. The good news is that, for most sites (e-commerce sites as well), much of what the Web Standards community advocates can be directly applied to fulfill the Level 1 requirements. Of the guidelines described, those I’ve asterisked below use common Web Standards principles. Most of the others (except for 1.2) simply require using common sense. (From my uneducated point of view, it seems like 1.2 really would take quite a bit of additional effort.)
* “Guideline 1.1”:http://www.w3.org/TR/WCAG20/complete.html#text-equiv – Write useful alt and summary attributes on images and tables.*
* “Guideline 1.2”:http://www.w3.org/TR/WCAG20/complete.html#media-equiv – This requires additional multimedia abilities, plus the time to do the work.
* “Guideline 1.3”:http://www.w3.org/TR/WCAG20/complete.html#content-structure-separation – Use headings to define document structure, and use semantically appropriate markup elsewhere.* When using CSS for your layouts, remember to make sure that your content makes perfect sense without any additional styling before applying the CSS.
For 1.3.2, this involves standard colorblindness considerations in design and data presentation.
* “Guideline 1.4”:http://www.w3.org/TR/WCAG20/complete.html#visual-audio-contrast – Make sure your graphics and text have enough contrast.
* “Guideline 2.1”:http://www.w3.org/TR/WCAG20/complete.html#keyboard-operation – Walk through your site with a keyboard. Add accesskey and tabindex attributes where it will make it easier to navigate the site.* This may be the most arduous of the tasks, especially in a functionally-rich site, but don’t forget that user agents already provide a degree of keyboard accessibility that can help you fulfill much of this.
* “Guideline 2.2”:http://www.w3.org/TR/WCAG20/complete.html#time-limits – This will be important for situations such as online audiovisual teaching, or other such time-based slideshows. For the typical e-commerce or informational site, this has little impact.
* “Guideline 2.3”:http://www.w3.org/TR/WCAG20/complete.html#seizure – When creating designs for clients, warn them against flashing graphics.
* “Guideline 2.4”:http://www.w3.org/TR/WCAG20/complete.html#navigation-mechanisms – Use headers to logically structure content. Include any two of the following: navigation lists on each page,* a site map, table of contents, links to other pages. If search ability is appropriate for your site, this counts, too.
* “Guideline 2.5”:http://www.w3.org/TR/WCAG20/complete.html#minimize-error – If your website permits data entry, tell your users if they’ve made a mistake, with an example of a correct entry. Mark required fields as required, in a way that can be identified with or without color vision. If they don’t fill in a required field, tell them which required fields still need to be filled in.
* “Guideline 3.1”:http://www.w3.org/TR/WCAG20/complete.html#meaning – Define what language your pages use.
* “Guideline 3.2”:http://www.w3.org/TR/WCAG20/complete.html#consistent-behavior – Follow standard usability practices when it comes to behavior — no unexplained popups, don’t move the keyboard or field focus away from where the user is working, and so on.
* “Guideline 4.1”:http://www.w3.org/TR/WCAG20/complete.html#ensure-compat – Make sure your code validates. Running your code against an (X)HTML validator “is a perfectly acceptable way”:http://www.w3.org/TR/UNDERSTANDING-WCAG20/Overview.html#ensure-compat-parses-techniques-head to ensure that it will be compatible with future browsers.
* “Guideline 4.2”:http://www.w3.org/TR/WCAG20/complete.html#accessible-alternatives – As long as your site is compliant, this guideline is fulfilled.
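To make the Level 1 points above concrete, here is a minimal, entirely hypothetical page sketch (all names and content invented) showing several of the asterisked Web Standards techniques at once: a declared document language, a text alternative, heading structure, and an explicitly labelled form field:

```html
<!-- Hypothetical example; not drawn from the WCAG documents themselves. -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml" xml:lang="en" lang="en">
<head>
  <title>Example Store: Contact</title>
</head>
<body>
  <!-- Headings define the document structure (Guidelines 1.3 and 2.4) -->
  <h1>Example Store</h1>
  <h2>Contact us</h2>

  <!-- A text alternative for non-text content (Guideline 1.1) -->
  <img src="storefront.jpg" alt="The Example Store storefront on Main Street" />

  <form action="/contact" method="post">
    <!-- An explicitly associated label; the required field is marked
         in text, not by colour alone (Guideline 2.5) -->
    <div>
      <label for="email">Email address (required)</label>
      <input type="text" id="email" name="email" />
      <input type="submit" value="Send" />
    </div>
  </form>
</body>
</html>
```

Running the result through the W3C markup validator would also cover the parsing side of Guideline 4.1.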
Most of these concepts can be learned by reading Zeldman’s “Designing with Web Standards”:http://www.amazon.com/gp/product/0321385551/sr=8-2/qid=1148951702/ref=pd_bbs_2/103-1193458-0520663?%5Fencoding=UTF8 and 37signals’ “Defensive Design for the Web”:http://www.amazon.com/gp/product/073571410X/sr=8-1/qid=1148951890/ref=pd_bbs_1/103-1193458-0520663?%5Fencoding=UTF8 (and no, they haven’t paid me to vouch for their books!). Although I have not read it, Cederholm’s “Bulletproof Web Design”:http://www.amazon.com/gp/product/0321346939/sr=8-5/qid=1148951702/ref=pd_bbs_5/103-1193458-0520663?%5Fencoding=UTF8 also comes highly recommended.
Where it gets difficult is meeting Level 2 and Level 3. This I believe is the reason the WCAG WG describes different levels — it is fairly easy to achieve Level 1 compliance, takes some effort to achieve Level 2, and requires quite a bit of dedication to achieve Level 3. This is where I think there needs to be input into the inevitable law-making process, to make sure that legislators have a good sense of the achievable versus the insurmountable, especially at the small-business level.
I would argue that, by identifying these levels of complexity, WCAG WG made a concerted effort to make some part of its criteria achievable to those with minimal resources. If your average fee is counted in the hundreds, versus hundreds of thousands, it would probably be unwise to suggest Level 3 compliance to your clients. But Level 1 compliance shouldn’t be outside the realm of possibility.
It may, however, be outside of our comfort zones, much as tableless design was (and still is) to so many.
In terms of comparing WCAG 2.0’s complexity to the complexity of meeting physical accessibility requirements, the U.S.’s ADA includes very precise dimensional and functional requirements, which, when added to state, county, and city ordinances, make it very difficult to jump into creating a building that is ADA-compliant. The reason it is easier to do so now than it was years ago is that the construction industry has been able to develop rules of thumb after years of application. We are simply looking at the front end of this process, because we don’t yet have the rules of thumb.
I have an ironic story of a two-story commercial building in Southern California. Because it is less than three stories tall, it does not need (and therefore does not have) an elevator. Because the second story has multi-stall public restrooms, those restrooms must have (and therefore do have) wheelchair-accessible stalls. Go figure.
Textile apparently converted my asterisks to tags. Imagine that, wherever you see a change from plain text to *bold text* and back in comment 75, there is an asterisk identifying where Web Standards concepts can be applied.
Apologies again — I remember watching the architects in my prior job working to meet a lot of ADA-related specifications, but it turns out that it wasn’t the ADA that specified the actual dimensions, but rather the municipalities (city, state, and county).
Then why are the Section 508 regs written in plain English?
Section 508 “itself”:http://www.section508.gov/index.cfm?FuseAction=Content&ID=12 is written at the same level as the “guidelines”:http://www.w3.org/TR/WCAG20/complete.html#N104B9 themselves. Each individual guideline is written in fairly plain English, such as
bq. Guideline 1.1: Provide text alternatives for all non-text content.
I guess the real difference is how the two bodies go about explaining their guidelines. “Section 508’s explanation”:http://www.access-board.gov/sec508/guide/1194.22.htm is a good read, and, as you say, is in plain English, while “WCAG 2.0’s”:http://www.w3.org/TR/UNDERSTANDING-WCAG20 is written more like a computer program. It seems that the folks who wrote the 508 guide are writers by nature — and possibly with legislative analyst backgrounds — while the folks who wrote WCAG 2.0 appear to be programmers by nature. They definitely want to provide exact criteria, versus providing examples.
If someone wants to make a more readable form of WCAG 2.0, by all means they should. It may not be normative, but then again neither is the guide to Section 508.
It looks like WAI is trying to set up specific tests that can be applied, whereas the writers of Section 508 did not. Tests are much harder to write and define than simple guidelines.
Also, the only part of the WCAG 2.0 that needs to be followed are the guidelines. All of the “Understanding” and “Techniques” documentation is informative, not normative, which means that they do not have to be applied exactly, or even read, if you can read the guidelines themselves.
I have yet to encounter a plausible explanation (particularly not from the Web Standards Project) as to why a technical specification has to be unreadable. That strikes me as defending the indefensible.
I would point out that writing and editing skills are valuable and should be sought out by a technical standards committee just as technical _proficiency_ is — if it weren’t for the fact that WCAG Working Group does not qualify members on their technical proficiency, many of whom, including a cochair, demonstrably have little.
Thus we can hardly expect them to commission a plain-English rewrite (and a W3C executive told me to my face in Boston in 2005 that such a thing would never be budgeted for). This is perhaps another reason not to hold out hope for WCAG 2, though Michael Landis is certainly giving it the old college try.
Sooner or later, someone will write an article titled «To Hell with ‘To Hell with’ Essays», just like Eric Meyer wrote “Considered Harmful Essays Considered Harmful”:http://www.meyerweb.com/eric/comment/chech.html .
One’s esteemed colleague (or ex-colleague?) Joe Clark writes:
bq. The Working Group was and is unreasonably fixated on automated testing (…)
I wonder if he can find evidence for this in the WCAG documents. I quickly checked the test procedures in the 213 techniques and common failures, and found that 32 of them (or 15%) are automatable, with 19 “maybes” (8.9%) that depend on algorithms that are either under development or that I have never heard of. I found 11 other test procedures (or 5.16%) where automation depends on the technology (HTML, CSS, …), the quality of language-detection algorithms, etcetera.
So in my most optimistic count, roughly 29% of the test procedures can be automated. I am also involved in a project that developed test procedures for WCAG 1.0 (Priorities 1 and 2): 35 of the 145 tests (or 24%) were considered automatable. If I use the same conservative estimate for both sets of guidelines, I get 24% automation for WCAG 1.0 (Priorities 1 and 2) and 15% for WCAG 2.0. Where is the threshold beyond which one can say that the developers of a set of guidelines are fixated on automated testing? Some clarification would be appreciated.
The problem with looking at the WCAG2 documents, Christophe (and may I interrupt myself here to congratulate you on your valiant, also apparently solitary and unaided, defence of WCAG Working Group), is that you are then considering the result of an ongoing weeding-out process of many criteria that cannot be reliably machine- or human-tested, the latter being a point of weakness and inconsistency for the Working Group.
Googling the phrase “not testable” with variations of “WCAG” and “Working Group” leads to many preliminary discussions on that front. WCAG2 is the end result of those discussions.
In any event, if your numbers are true, they argue for an _increased_ number of guidelines that aren’t machine-testable. We’ve got room, don’t we?
Note that the deadline for comments on the WCAG 2.0 Last Call Working Draft has been extended until 22 June 2006. Please see the “extension notice with additional information”:http://lists.w3.org/Archives/Public/w3c-wai-ig/2006AprJun/0083.html and the “Overview of WCAG 2.0 Documents”:http://www.w3.org/WAI/intro/wcag20 .
WAI encourages you to share your comments with the WCAG Working Group. The Working Group reviews and responds to all comments submitted on the Last Call Working Draft, per W3C process.
Over at Juicy Studio (www.juicystudio.com) a Member of the W3C Web Content Accessibility Guidelines Working Group is lodging a formal complaint about the lack of success criteria (read: checkpoints) addressing people with cognitive disabilities. I am amongst a number of people who are cosigning this complaint.
As Joe attested, one of the biggest problems I have with WCAG 2.0 is that it doesn’t address cognitive disabilities. What is ironic is that WCAG 1.0, lambasted for not having enough of these checkpoints, actually includes more checkpoints to assist these people than WCAG 2.0 does. I did a quick review of the WCAG 1.0 -> WCAG 2.0 mapping document to prove it.
The one cognitive-disability-related checkpoint in Level 1 of WCAG 1.0 doesn’t map to anything in WCAG 2.0.
Of the eleven cognitive disability related checkpoints in Level 2 of WCAG 1.0:
– two checkpoints map to Level A in WCAG 2.0;
– five checkpoints map to Level AA in WCAG 2.0;
– two checkpoints map to Level AAA in WCAG 2.0; and
– two checkpoints don’t map to anything in WCAG 2.0.
Now, we need to take into account that the most useful of the cognitive-disability-related checkpoints in WCAG 1.0 were in Level 3…
Of the nine cognitive disability related checkpoints in Level 3 of WCAG 1.0:
– one checkpoint maps to Level AAA; and
– eight checkpoints don’t map to anything in WCAG 2.0.
In total that’s eleven checkpoints out of 21 that are no longer in WCAG 2.0.
I have read this article a couple of times now, and I just get more nervous every time.
I just went through, with my trusty highlighter, the document that sets out how WCAG 1 compares to WCAG 2. There’s just so much I can’t see ever being implemented, and I think the fact that so many of the guidelines (especially at triple-A) seem like some magic level most will never reach will put people off, especially those working commercially, as time and resources are limited. It really seems like a step backwards.
My hope is that the Samurai, or some other group out there, comes up with a more viable, thought-out set of guidelines that actually ASK developers and users (able-bodied, disabled, or just fussy) what they want from web content.
Democracy in action…
Congrats to everyone who posted. Whatever you think of the 2.0s, the review period needed the extra couple of weeks.
Who honestly lives, day in and day out, by the WCAG rules? They’re just plain silly, and some are too obvious. And who in the world is going to waste their time reading through several hundred pages of hard-to-understand crud?
This serves very nicely to underscore the uselessness of standards organizations who think themselves above real-world practice and put forth snobbish documentation. I for one hope WCAG 2.0 gets largely ignored, and people go on designing good websites based on established industry practices and user/community feedback.
Surely the point of WCAG *SHOULD* be to make web content – eg websites – easier to use for disabled people.
Even if it was superbly written and useful I doubt many people are going to want to read 500 pages of guidelines. And as it seems to be written in undergrad-legalese that’s going to make even less people want to read it.
I would have thought it was fairly obvious that an unread document is uselss – regardless of any other criticisms of it.
I agree with Joe 100% – and I have to say I support the idea of the Accessibility Samuri.
Great article, but there are a couple of conclusions which don’t seem to follow from the links you provided.
bq. 11. You can’t use offscreen positioning to add labels (e.g., to forms) that only some people, like users of assistive technology, can perceive. Everybody has to see them.
From the document:
bq. …ensure that information and relationships that are implied by visual or auditory formatting are preserved when the presentation format changes…
This doesn’t seem to mean that labels must be visible to all users; rather, it talks about implied relationships, which could be through visual positioning, context, labels or other cues. By this standard, moving a label off-screen is often legitimate as the _only_ way of preserving an implied visual order or structure which is also provided for non-visual or non-CSS browsers.
Someone has already commented on point 12; the linked section specifies that CSS should not be used to provide structural information which could be provided in the markup. To my untrained eyes, that means you should present content in the most appropriate order to establish semantics. Specifically, this condition is failed when:
bq. …the modified layout changes the meaning of the content.
Placing a navigation list at the bottom of your source and using CSS to position it at the top or side of your page would not change the semantics.
Your tenth point appears to indicate a particularly nasty omission. However, the immediately succeeding recommendation seems to address the issue of unwieldy navigation (and, incidentally, suggests that basic navigation should be accessible from any page). It is notable that this guideline is actually counted as a level 2 recommendation, while “skip navigation” links are level 1, which strikes me as odd.
I might be missing something, and if so please correct me! Point well taken that you are exposing vagueness in addition to actual flaws, but these _specific_ sections seemed quite clear to me.
If it were superbly written, it would probably be a lot less than 500 pages!
Unfortunately, as Joe points out, this document doesn’t address standards-compliant web developers. People who build web pages don’t need an exposition about “web units,” because in this context it already has a perfectly established meaning. Personally, if I ever feel a burning need to follow the WCAG(Web Content Accessibility Guidelines) 2 standard, I’ll wait until someone writes a document targetted at my audience, because this draft is certainly not concise, elegant or aimed at developers like myself.
I have given up trying to understand the section about positioning of elements. I think my interpretation is _more_ plausible but am not willing to stake an unencoded ampersand on it. I will advance the opinion that WCAG Working Group only half-understands CSS layouts and actually would kind of like to make them illegal, or at least the nice-looking ones. That is, however, merely an opinion.
We read attentively your article and thought it will a good idea to start a real WCAG 2.0 discussion in french community. Your article is the perfect starting point for that but unfortunatly he was in english only, so we decided to translate it like Alistapart copyright permit it, to make it easy to understand for french people.
This work was published yesterday on our three blogs :
with two presentation : The first is the translation alone, we tried our best efforts to be sure to translate your meaning and style. The second is the same text with our notes without alter your meaning, we thought it’s more clear to put them with the context of translation to avoid to be misunderstood and be incentive to start a good discussion.
And, finaly a classical comment’s page is also available for our visitors to comment our notes and the wcag 2 process generaly. If there is some comment about your article, we will try to translate it and post it regulary here.
here it is our essentials notes on your article :
We think that WCAG 2.0 is an improvment related to the WCAG 1 restrictions, especially about new technologies and web usages even your’e right when you wrote that WCAG 2.0 break the usual WCAG 1.0 practice and that this version of WCAG 2.0 can’t be a legislative reference. The whole document is to obscur to be a real efficient tool and need some rewrite, improvment and application documentation.
Your notes about particual points are a good demonstration for this requirements and some others are a bit more discutable.
1) the notion a page and site are now becoming more and more versatile, is there realy some page in ajax application for exemple or a RSS syndication web application is realy a site ?
2) Even html validation will not a requirement, it wil, in same time, the clear and simple way to be sure to have an “unambigous” structure
and it’s clearly written in the how to succed document with the point 2. Validating Web units. In the other hand we think it was better if they clearely write it on the guideline even in the level 3 and there DOM validation method is a nightmare for the common user.
3)We didn’t read nothing about an authorization to uses layout tables, the reference explain that it will not a failure. The succes criterion 1.3 is a clear requirement to uses layout CSS techniques and the guideline 1.3.3 a garanty of not allowing the bad nested tables
5) We consider that the “baseline” concept is a good way to determinated an operationnal area from a web site related to the intention and we are conscient that it can be a ubject of interpretation and that it will be a little complicated to manage it. But its a solution out of WCAG 1 area and it was a strong problem for our practice, our clients and finaly for all the web development.
6) in theory this was a good idea, you can exclude a part of your site, you must clearly say it and you can’t include the non accessible element on an accessible part of your site. Unfortunalty, the document is unclear they say too that you cannot exclude a particular type of content and in the exemple they exlude only the video. it’s the perfect demonstration that this point is a open door to everything.
9) the podcast are not multimedia but there are non textl content so the WCAG say clearly” For non-text content that presents information, such as charts, diagrams, audio recordings, pictures, and animations, text alternatives can make the same information available in a form that can be rendered through any modality (for example, visual, auditory or tactile). Short and long text alternatives can be used as needed to convey the information in the non-text content. Note that pre-recorded audio-only and pre-recorded video-only files are covered here”. So the text alternative is required. For the slideshow you are right, bad for flicker but user need to have an alternative for all the picture, the question is synchronised or not?
12) We think you misunderstood the subject of this requirement. The absolute properties isn’t forbidden, never, but it is forbidden to uses it rather than structural markup when the modified layout changes the meaning of the content
14)The alternate version solution is not a good way, we all know that, but i prefere to suggest at owner of a full flash website to make and alternate version and update flash to make it keyboard usable than forget accessibility and forget Flash because he will forget me.
Here it is the essentials quotes on our notes.
With our best regards A.Levy, J-P Villain, M.Brunel
I think Joe Clark is right. WCAG2 in the core issues is a step backwards not forwards.
I’m merely a webmaster, although an unashamed standards and web accessibility advocate but, I had vague feelings that something went wrong in the last few years of work on the WCAG when two friends of mine, one of whom was blind, mailed me and asked me to submit comments. They said I had some experience in the trenches and would bring some needed reality to the ivory tower.
After finally dragging the rest of the Web kicking and screaming to a world where CSS is finally an accepted technology and people actually care about good ALT and TITLE values, WCAG2 tells us to stop worrying about it?!
Perhaps the reason people are reacting so strongly to this article can be explained by the assumption that the WCAG2 documents can’t be made sense of. This article is one of the few articles which interpret the guidelines in such a way informed developers can actually have an opinion about it.
If there was a ‘WCAG2 web interpretation reference’ which would take the WCAG2 and interpret EACH PART into something *assessable* it would be much more approachable to developers..
This however can only be build on a WCAG2 that can be assessed itself, even with optional incomprehensible language included. That doesn’t seem to be the case according to this article and I won’t be able to find out otherwise. How can we use something if we can’t assess it properly.
As with a lot of standards, the gap between theory and practice has to be narrowed by us web-pioneers. The ones who actualy have to build the stuff. We’ve got so far now with Markup, Stylesheets and are on the right path with behavior. Defacto standards, pragmatic interpretations and best practices are the way to go. We can and should take command by reversing the roles played in this poor accessibility-soap. WASP-accessibility group could participate in this. Let me tickle you with some of my thoughts. Take a peek at http://blog.webbforce.nl
It seems to me that the people who are putting this together are either bad web designers covering their own backsides or have been “persuaded” by large companies who can’t be bothered to comply. There’s certainly something rotten about the whole thing.
W0w! A great (and exhausting) post – approval by committee seems to fail again.
Joe Clark’s article was an eye-opener. It raised critical awareness for the last-call W3C working draft, which lead to the extension of the comments period. Still the degree of concern and fear didn’t need to be raised. He exaggerated many issues, distorted them by omission, or in some cases “he’s plain wrong (Article “To Hell with Joe Clark”?)”:http://learningtheworld.eu/2006/to-hell-with-joe-clark/.
Got something to say?
We have turned off comments, but you can see what folks had to say before we did so.
More from ALA
Personalization Pyramid: A Framework for Designing with User Data
Mobile-First CSS: Is It Time for a Rethink?
Designers, (Re)define Success First
Breaking Out of the Box
How to Sell UX Research with Two Simple Questions