A few years ago, Joe Clark famously wrote the following:
If your site has valid code or something trivially close to same, you are working with, and within, Web standards.
If you serve up tag soup or any document with myriad validation errors, you are merely using CSS layout…. The matter is now settled.
Almost exactly one year later, Doug Bowman had a different take (emphasis mine):
We don’t point out validation errors on public redesigns anymore. We know a valid site is such a tiny part of any overall measure of success. Validation is something I only do on my own work now.
Here we have two well-known standardistas, both of whom have done (and will do) more for the adoption of standards than this author ever will. Yet both have different takes on what role validation plays in designing for the web. In fact, they perfectly represent the division that exists between standards advocates today. You probably find yourself taking one of two positions on validation:
- You take a hardline stance, rightly stating that if we fail to follow the conventions of a language, then we’ve produced something altogether different and, well, invalid.
- You take a pragmatic view, rightly stating that the invalid code generated by broken tools and third-party code shouldn’t negate one’s overall commitment to web standards.
So if both views are right, where does that leave us?
The problem at hand
We can all agree that the realities of the web make it hard to build a standards-compliant site. Once the client’s CMS, outdated WYSIWYG editors, and third-party advertising code have finished with once-valid markup, things begin to look ever-so-ugly under the hood; this leads many to suggest, like Bowman, that an insistence on validation is at odds with commercial web design. Given that most of these invalid sites look fine in a browser, the amount of time and money required to produce perfectly valid final code seems not only prohibitive, but pointless.
Valid markup has become equated with two things nobody wants: impracticality and implausibility.
Refining the message
If it weren’t for the early days of standards advocacy, for sites like the CSS Zen Garden, Wired News, or Fast Company, we wouldn’t be as far along as we currently are; heck, I’d probably still be a self-hating spacer.gif slinger. Despite those successes, our fractured take on validation stems partly from the wonderful evangelism that got us here.
Whenever I conduct a training session, I poll the room to see why the audience uses or plans to use web standards. The responses typically read like a doctrine that my generation of web designers have been raised on. Namely, that building with web standards can…
- shorten development cycles, as we no longer have to slog through six layers of nested tables to build site templates.
- lower maintenance costs, as the CSS Zen Garden showed us.
- decrease page weight, which in turn reduces page load times and dramatically lowers bandwidth costs (we’ve Mike Davidson’s excellent ESPN.com interview to thank for those metrics).
These are, I think, the “sexier” benefits of web standards, the bulletpoints we’d use to sell prospective clients on CSS-/XHTML-driven designs. And with good reason: these are all excellent, compelling points. No sales pitch should leave home without ’em.
Noticeably absent from the list is any mention of why we should adopt standards, or what that process actually entails. I mean, I’m sure we can list benefits of producing valid code, such as:
- A proven increase in a site’s accessibility,
- The promise of device independence,
- The presence of a metric against which an individual or a team’s production can be measured, and
- The knowledge that your site is future-proof, displaying in any standards-compliant browser yet to be invented.
But the sum total of those points doesn’t exactly scream “compelling business case.” When you’re speaking to a mid-level marketing executive about standards, which would you rather lead with: saving terabytes of bandwidth, or investing in device independence?
Yeah. That’s what I did, too.
Yet while the benefits of valid code may not be glamorous, we can—and should—talk about them. Validation isn’t an end result or a final deliverable; it’s an ongoing process that continues long after a site launches. If we don’t put the proper tools and commitment in place, our work will start looking like a late ’90s throwback, and if we don’t provide guidance and education on validation, the polished, perfect pages we produced will be snapped into software that’ll produce tag soup in seconds flat.
So how can we speak about validation in a way that’s compelling to our clients?
The hidden cost
Validation might not have been the sexiest selling point for standards, but it does have very real fiscal benefits. In the past couple years of running my own practice, I’ve become slightly obsessive about tracking my time, especially when it’s spent dealing with bug reports. When an issue comes in, I note the error and the account, and start the timer. Once it’s resolved, I note the cause, stop the timer, and move on.
Toward the end of my first year in business, I noticed that more and more of my time was spent working around invalid code. Layout issues that would have been trivial to fix in a valid, error-free template would take significantly longer to debug in a live page that had a few hundred validation errors. It was a matter of figuring out which parts of the page weren’t causing the errors, so I could focus on fixing the problematic section. But when the page’s markup has three or four hundred validation errors, this process quickly becomes a time sink. A necessary one, but a sink nonetheless.
So by year’s end, I found that approximately fifteen percent of my time was spent mired in invalid code. As an independent designer/developer/something, I’m grateful for all the work my clients send me. Still, what if I was a salaried employee? If IT departments conducted a similar audit, I’m confident they’d find similar numbers. And this kind of auditing needs to happen. Invalid sites may look the same as those built on a foundation of valid, well-formed code, but in my experience, they invariably cost more to maintain. This is the silent weight of invalid code, a hidden cost we don’t discuss nearly enough.
Web two-point-next
None of this changes the here and now. To be honest, the pragmatists are right: for the most part, validation and commercial web design are polar opposites. But the tools are evolving to the point where we can begin moving beyond validation as a roadblock. CMSes like WordPress and Slashcode are dedicated to producing standards-compliant code, and visual editors such as Dreamweaver and (more recently) Microsoft Expression Web almost stubbornly refuse to produce invalid markup. So where do we go from here?
Pitch process, not code
In recent months, I’ve been relearning how to sell standards. I still touch on the exciting bits (the lighter pages, the lower maintenance costs, and so on), but I don’t shy away from selling validation’s role in unlocking the real savings of web standards. And it’s been an easier sell than I’d thought: once you’ve shown a client how standards can improve their sites’ accessibility, keep them future-proof and device-independent, and lower maintenance costs, they’re usually ready to listen.
And that’s where the real conversation begins. By considering your client’s production workflow and the software that supports it, you and your client will be better able to identify what could break your joint commitment to standards—and as a result, they’ll be better able to fix these issues themselves.
Shop smart: shop standards
Companies like Adobe and Microsoft have recognized the growing market for standards compliance, and openly tout their products’ W3C-friendliness in the sales material. But despite that silver lining, most CMS tools and online advertising companies are spewing out code that would make Netscape 3 proud.
This is where the lone consumer can move mountains. When meeting with a prospective vendor, our clients need to ask if the product is standards-compliant, much as they might ask if an ad serving solution provides targeting information, or if a CMS is J2EE compliant. Standards should be an equally weighted part of any decision-making process—and if we remind our clients of the financial benefits of validation, it will be.
Same sandbox, same struggles
But in all honesty, the real work begins with us. Regardless of whether we find validation impractical or imperative, the infighting in the standards community is the biggest obstacle to real progress. Instead of trying to understand what factors make both sides agitated, we’ve vilified the people on the other side of the argument. We need to identify what’s making 100% validation so expensive and difficult, and work on removing those factors.
As our contribution to that effort, we’ll be discussing common validation killers and ways around them in an upcoming A List Apart article. You can contribute by using this article’s forum to bring up common obstacles to validation and the workarounds or process changes you’ve used to get past them.
Samuel Johnson once said, “Where there is no difficulty there is no praise.” Personally, I think that Sam would’ve sung a different tune three minutes into debugging his first CSS layout, but the man has a point: we can’t fall prey to complacency.
In a perfect world, clumsy software and bad workflows wouldn’t break our code, and validation would just happen. But until I also get that magical flying pony I asked for, we’ve got some work to do. After all, true standards compliance is only as impractical or implausible as we make it. Given how far we’ve come in the past few years, this next challenge seems like a trivial one indeed.
Let’s get to work.
One problem I am faced with a lot is that there are no good tools I can give to my clients that don’t end up creating valid tag soup. (1,000 nested tables is still valid code, but it’s no fun to maintain and wastes bandwidth.) So far the only thing I have found that creates good semantic and valid code is a text editor and a person who knows what they are doing.
Validation is a great development and testing tool to make sure that your syntax is correct, but it can’t fix your semantics. Your semantic markup is where your true benefits come into play: accessibility, lower bandwidth, and easier maintenance.
My hat is off to those who are promoting standards and trying to get them implemented. Sometimes I think there couldn’t be a more frustrating job in the world than front-end developer, but then I remember you guys.
Good article, in an uphill battle like this it is always good to be reminded why we do what we do.
Search Engine Optimization is the easiest way to sell web standards. All the other benefits of web standards can be dismissed, but the SEO benefits of valid semantic code aren’t so easily passed up.
When talking about the benefits of web standards with clients, they usually lean forward when they hear about SEO and start asking questions right away. Explaining other benefits, such as accessibility, is so much easier after that.
“Interesting article about validation and the top 50 sites”:http://www.thesleepingpoint.com/article.php?id=8
I’m validating most of the pages that I create. It’s not only about being ready for the future, despite browsers’ incomplete support for web standards. It’s also important because you can be nearly sure that search-engine bots can read the page.
While this is not a selling point to clients, it’s also important to remember that without valid code there is no guarantee that any JavaScript traversing and accessing the DOM will function as desired!
bq. Search Engine Optimization is the easiest way to sell web standards.
Thanks for weighing in, “Lasse”:http://alistapart.com/comments/whereourstandardswentwrong?page=1#2 . I agree that SEO _can_ be a compelling selling point, but there has been “some debate”:http://meyerweb.com/eric/thoughts/2004/08/18/ses-san-jose-corrections/ regarding its relationship to web standards. And I think the traditional argument relies more on maintaining a good code-to-content ratio, rather than relying on valid markup/CSS.
But more importantly, I think it lapses into some of the same errors as the other “sexy” points: it’s an end-run around establishing a real business case for the proper care and feeding of valid code. After all, selling CSS isn’t the problem these days; selling standards is.
bq. While this is not a selling point to clients, it’s also important to remember that without valid code there is no guarantee that any JavaScript traversing and accessing the DOM will function as desired!
I think that’s true in theory, “Ross”:http://alistapart.com/comments/whereourstandardswentwrong?page=1#5 , but I’ve personally found that my DOM scripts are surprisingly resilient to invalid pages. Have you found otherwise?
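To make the question concrete, here’s a minimal sketch (entirely hypothetical markup) of why “resilient” and “reliable” aren’t quite the same thing. The unclosed <b> below forces each browser to repair the tree on its own terms:

bc. <!-- Hypothetical page: the <b> element is never closed. -->
<p id="intro">Hello, <b>world</p>
<p id="outro">Goodbye.</p>
<script type="text/javascript">
// Browsers repair the unclosed <b> differently: some close it at the
// end of the first paragraph, others carry the boldface into the next.
// A script that walks childNodes or counts elements can therefore see
// a different tree in different browsers.
alert('b elements found: ' + document.getElementsByTagName('b').length);
</script>

Every browser will run that script; they just won’t necessarily agree on what the repaired tree looks like, which is exactly the sort of cross-browser time sink tallied up in the article.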
One of my favorite ALA articles to date.
ASP.NET seems to be the biggest problem we have. Admittedly it is getting better, but it is frequently frustrating to see a nice clean page suddenly filled with inline JavaScript and invalid id attributes.
David: I don’t have much experience with ASP.NET, but it looks as though it’s possible to configure it to produce valid XHTML [ “1”:http://msdn2.microsoft.com/en-us/library/ms178159.aspx , “2”:http://aspnetresources.com/articles/HttpFilters.aspx ]. Perhaps some more back-end folks could weigh in, whether it’s .NET or another technology.
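Judging by the first article linked above, the switch can be a one-liner in web.config. A minimal sketch (ASP.NET 2.0 or later; untested here, so treat it as an assumption to verify against the MSDN page):

bc. <!-- Ask ASP.NET to render XHTML 1.0 Strict-conformant markup
     rather than its legacy output. -->
<configuration>
  <system.web>
    <xhtmlConformance mode="Strict" />
  </system.web>
</configuration>

Per that documentation, the mode attribute also accepts Transitional (the default) and Legacy.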
Great article. In response I’d like to say that with a proper plan, and a basic understanding of design, a site can be created to standards very easily.
Tools are not our enemies. I work within Dreamweaver 10–12 hours a day, and it is quite easy for me to develop a site that conforms with valid code, doesn’t have to use tables, and can be very device-independent.
The key in my opinion is to put in the time and effort to plan a site out, before simply jumping in and throwing content around. It isn’t hard, but it does take a solid plan and quality organization.
I’m with “this guy”:http://www.thesleepingpoint.com/article.php?id=8 . Web standards are great, and I follow them without hesitation, but validation remains an unnecessary step for most designers.
My code validates without the effort. If I miss a detail and it doesn’t validate, my usual reaction is a big shrug. One slip in validation doesn’t make or break a site in any sane situation.
I agree with Alex that validation is often an unnecessary step. Once you are in the habit of writing standards-based code, it’s rare that you need to bother checking a validator at all.
My biggest hurdle with standards isn’t semantic markup, clear and scalable CSS, or even embedding Flash content. It is dealing with major web-browser standards inconsistencies (e.g., MIME types like application/xhtml+xml).
What steps are others taking to stay true to the strictest of standards when it causes your page to simply not display in Internet Explorer?
It’s a real pain to work in a large company and be the only standards advocate. I go back and forth with the other designers and the IS people. I’m winning the other designers over, but they all come from print backgrounds, so it’s still a foreign language to them. Our IS department refuses to cooperate. We are in the process of building a new e-commerce engine, and I’ll be responsible for creating some controls for it, for sorting product, etc. The code that this thing generates is horrid. Non-semantic markup, nested nested nested nested tables, duplicate IDs, and pages missing doctypes run rampant.
My question is, how do you convince a group of people, who already _know_ the benefits of standards, that adopting standards is in the best interest of our team and the company?
I agree with the article. Sometimes I let validation slip (usually in my CSS and not my XHTML, though) when it serves a purpose. For instance, I have used alpha transparencies on some image rollovers because it saves the work of creating two images, plus it saves bandwidth. It adds more work for the user agent, but that makes sense. Since these CSS properties aren’t compliant, but _do_ work without breaking browser or other device functionality, I see no reason to avoid them in many cases. We have to remember the *_reason_* for standards and validation, and not just focus on the letter of the law.
The realities of CMSes and third-party programs make perfect strict validation impossible a lot of the time, but this doesn’t mean we should stop striving for it. Whenever I get the chance to replace an old invalid program with a new one, I do.
And as far as selling clients on CSS layouts and standards: if the tools allow it, it’s a non-issue. It’s just how we do things now. A client will almost never specifically ask for a nested-table-based design.
My first reaction to this article is the initial quotes. Are Joe Clark and Doug Bowman really saying different things? Joe talks about “valid code *or something trivially close to same*”. By which I assume he won’t cast you out for an unencoded ampersand. Doug Bowman says “*Validation is something I do … on my own work*”. So he’s hardly saying it doesn’t matter.
Am I missing something here? Not going around crowing over minor errors in others’ code – which is what I think Doug is saying – is different from not thinking validation is important. It’s about having some respect for other people, and recognising that errors in a published site may have no relevance when judging the abilities of its original designer/developer.
My opinion has always been that validation is essential to web standards, but that web standards not just about validation. Nick Morgan nailed it for me in the “very first comment”:?page=1#1 :
bq. Validation is a great development and testing tool to make sure that your syntax is correct, but it can’t fix your semantics. Your semantic markup is where your true benefits come into play: accessibility, lower bandwidth, and easier maintenance.
We all know you can write code that validates but is a million miles from what web standards are supposed to be about. Take a moment to picture the table-layout, HTML 4.0 Transitional nightmare right now…
… [insert your picture here] …
Yes, we need publishing tools which don’t produce invalid code. But increasingly we have those tools. The problem is they don’t help people create valid *semantic* code. I can give a client a standards-compliant web editor. It’s more difficult to stop them creating ‘headings’ by highlighting a line of text and hitting the *BOLD* button. I’m not sure this is something tools can fix.
The benefits of validation can be used to “sell” Web Standards but it’s not the solution to web standards being a hard sell. Web standards have many benefits, but no single “killer” benefit. For some people, one or other nails it – it might be SEO, accessibility, page weight, flexibility. In some sectors where regulation and standards are common, validation and the idea that there is an objective standard against which a website can be “tested” is attractive.
The problem is that any individual benefit can be dismissed by the standards refusenik. Harping on about validation isn’t going to fix that. I don’t have the answer to converting the standards-refuseniks. I wish I did.
I fully agree with Sophie. As long as a site’s editor doesn’t know the difference between a heading and bold, slightly larger text, we won’t get really accessible sites—independent of the validation result.
Writing valid code is something one can learn like a foreign language or mathematical formulas. But writing semantic code needs a deeper understanding of the content and the different ways the content can be accessed through web clients or search engine bots.
So let’s train the editors and give them tools to write semantic code in a valid way.
The Sleeping Point put out “a response”:http://www.thesleepingpoint.com/article.php?id=10 to this and another article on standards – another interesting read.
To amplify Ethan’s point:
The opportunity cost of creating valid markup is low, under certain feasibly-achieved circumstances. Fifteen percent of anyone’s time (that spent picking up stray strands of spaghetti) is a lot.
What are the other circumstances?
* Validity directly causes lossage (e.g., application/xhtml+xml, embed vs. object).
* Third party legacy apps
* Lack of developer education
Of these, only the first serves as an excuse IMO. If you’re producing your markup sensibly, you don’t need tools that write markup for you.
As for the legacy apps, shouldn’t it be possible to encourage migration to better tools on the grounds that the better tools are “with it” as opposed to “outdated” or “obsolete” on account of the quality of their output?
The third neighborhood, that of the clueless co-dev, is the most painful. They’re on the team because of good fit, and technical brilliance doesn’t come into the picture at all. These people need to be marginalized if possible, otherwise picked-up-after.
Smart people understand quickly why valid markup is a good thing, and even they will agree that their excuses for continuing to rely on tag soup and related flavors of slop are, in fact, nothing more than _excuses_.
At the end of the billing period, however, the sponsor’s the one making the rules. As long as they understand the consequences of doing things half-assed, why not just give ’em what they want? If you’re nice the first time around and you have the good grace not to say “see I told you so,” then maybe they’ll hire you to do the revamp when the time comes.
…And it will come.
I try to make validation an essential part of my workflow, so I validate as I am creating. I still end up with errors at the end, but they are fewer and easier to fix. I make sure my pages validate before I start fixing bugs, as sometimes after I remove the validation errors, some bugs disappear, so it’s my first port of call. Validation is an important web standards tool. Use it!!!
Proclaiming that CMS applications and WYSIWYG editors make it impossible to create a valid site is a cop-out!
Using bad CMS applications and WYSIWYG editors makes it impossible. The answer – use good ones.
“WordPress”:http://wordpress.org/ is a CMS that will always be well behaved with your perfectly valid code – and if you ditch the lackluster “TinyMCE”:http://tinymce.moxiecode.com/ for the brilliant “XStandard”:http://www.xstandard.com/, you’re onto a winner.
The only hurdle you need to get over then is training your business users that “Strong Emphasis” does not a heading make!
Oh yeah – and your crappy ad code 🙁
I think bandwidth is a huge seller. After I demonstrated to the senior execs in my company that their OLD website (done before my time) took over 100 seconds to load on a 56kbps connection, and that x% of their users were on dial-up (probably investors on AOL), THAT made some instant converts.
My biggest gripe now is that the company that manages our investor relations site uses a closed proprietary system to generate HTML, and it’s horrible. Any company that provides a service to other companies (be it investor relations, shopping carts, search results, blogs, etc.) should know the added value of providing standards as part of their solution.
I’ve recently been exposed to a substantial open-source app. It works well enough, but the HTML is possibly the worst I’ve ever seen, sometimes racking up over 3,000 validation errors on a single page, including such gems as preceding the HTML tag with some JavaScript…
I asked the developers about it, and the response was that it had to be that way to work in IE, and it was far too difficult to make it work in other browsers. Some of it doesn’t work in IE anyway, but I guess they will continue stacking worse HTML on top of bad stuff until it disintegrates, rather than fixing the core problem.
Not surprisingly the underlying PHP was of similar quality – spewing out hundreds of notice errors. When I reported them as bugs, they were all dismissed as bogus, saying that they were “not interested in fixing notice errors, only real bugs”.
I’m often incredulous that some developers even attempt to make wildly invalid code work across browsers – it’s just such a bad place to start from. To date, I’ve not convinced a single one of the error of their ways, and yet they still have jobs…
I think sometimes people confuse valid markup with CSS layouts.
Nested tables, font tags, unquoted attributes, unsemantic markup, and bold tags instead of proper headings… all of these can validate.
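To put a face on that claim: both of the following hypothetical snippets validate under an HTML 4.01 Transitional doctype, but only the second says anything about structure:

bc. <!-- Validates, yet semantically empty: -->
<font size="5"><b>Latest News</b></font>
<table><tr><td>Story one</td><td>Story two</td></tr></table>
<!-- Also validates, and machine-readable: a heading and a list. -->
<h2>Latest News</h2>
<ul>
<li>Story one</li>
<li>Story two</li>
</ul>

A validator will wave both through; only a human (or a well-trained tool) can insist on the second.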
As usual, “Ben”:http://alistapart.com/comments/whereourstandardswentwrong?page=2#18 is right on the money. Discussing validation intelligently with our clients and colleagues provides them with valuable information they need to make other critical decisions. The case for selling standards has been out there for a few years now, but it’s incomplete. We need to be more vocal about what the “care and feeding” of web standards entails, and that means bandying the V-word about.
Having been a programmer through project manager over the last 23 years (from programming COBOL & PL/1 on MVS mainframes, through C on Linux, LANSA and RPG3 on AS400s, and team leading/managing projects of all sorts on all platforms), I am still amazed that this fight is still being fought, or indeed had to be fought at all.
As an old mainframe programmer I am gobsmacked that coders of web stuff are not forced to sit through a structured walkthrough (SW) of their code before setting it live. Nothing gets presented unless ALL compile errors have been eliminated (for compile, read validation or PHP errors), except for a very small subset of allowed errors, and the reason for leaving them in is documented in the code.
I’ve heard all the arguments about the web being ‘different’ (and so were Client Server and RAD, to name but a few trends I’ve seen in my time) & had a flaming row with my last (web) development team leader (I won – he was fired), as coding to published standards (whether branch, enterprise, customer or W3C) & conducting SWs had a long history & was part of the culture in our firm – it was a holdover of our mainframe days & they usually resulted in tight, standard, and maintainable code with few errors, usually delivered just a little bit late.
We had some projects with ‘the web is different’ PMs who coded according to the anarchy that is the web; they were delivered very, very late, unmaintainable spaghetti, and bloated. When they dropped into crisis, an old hand (usually with a mainframe background) went in, and the first thing he or she did was stop all development & start running walkthroughs to establish a reasonable quality baseline; only when all was up to standard was development continued. In the long run it saved time – although senior management usually went ape to discover NO progress being made for a little while – they were certainly convinced when presented with examples of the c***p code that had been produced to date.
regards
Kim
PS – In case you are wondering my job went to India 🙁
For those that are singing the praises of “The Sleeping Point”:http://www.thesleepingpoint.com/article.php?id=8 – what do you make of these snippets?
bq. Reload that page in every browser known to man – they know exactly what to do with a tag that doesn’t close.
Here’s someone who doesn’t understand XHTML. A compliant XHTML browser, being served application/xhtml+xml, *will* stop when it is served invalid code.
bq. And the alt tag really isn’t necessary for every damn image on the page.
If you care about people who may be using screen-readers, yes it is. You wouldn’t leave the word [IMAGE] visible on the screen, so why leave it there for those people who use non-visual agents? Simply not acceptable.
I’m always amused to see that a majority of the people who rabidly push for “validation”, “standards”, “accessibility” and all other current fads are lacking a formal education in graphic design or programming.
The irony is that, in pushing for a higher level of “purity”, we might be destroying the web’s most powerful trait: its democratic nature.
Think about these questions:
1. How much smaller would the web be if browsers just stopped displaying pages with errors in them?
2. How many “standard” advocates would have to get a job serving burgers if web-related development/design/work was a regulated profession, like medicine or law?
3. Wouldn’t regulation (as in “needing a license or college diploma”) immediately result in better coding practices across the board?
4. Wouldn’t regulation provide clients with greater assurance that the guy providing web services for them is capable?
5. How many people here would happily accept regulation?
Still don’t get my point? It’s simple: the more we raise the “technical bar”, the less democratic the web becomes. Do we really want that? Is that good for the average (non-technical) person?
Now, I’m not against standards or validation or any of that. I personally recognize the great value in all of that. I’m just pointing out that a dose of “impurity” is probably good for EVERYONE…
@20 – Excellent point. It goes back of course to the old, _Garbage in, garbage out_ philosophy. I use “TextPattern”:http://textpattern.com to manage some of my stuff and it works great, as long as I continue to give it good code. Where I work we have a home-grown CMS built in ASP, and it is simply horrible. The code we have is backwards and anything but valid. ASP can be configured to generate Strict or Transitional XHTML, so the issue isn’t entirely the software you use.
@25 – I think the issue with Web developers is that the Web is so lenient with bad code. Browsers for the most part can work around errors, auto-close tags, and such. So as for validation, I think that everyone who develops for the Web can and should be very careful with their code, to make clean, structured documents. As for semantics, that becomes more of an issue of coding style. Good programmers can make something that is efficient and effective. Bad programmers can make something that is inefficient but still effective. The same goes for the Web: presentational tag soup is just as “effective” as semantic markup, in the sense that a Web browser will give the desired result. It is an “effective” solution that is nonetheless both inefficient and bad.
I think the Web lacks standards because they aren’t required to achieve the end result. But democracy and free (“_as in free speech, not free beer_”) information are what made the Web great 16 years ago and are what still make it great. Regulation is a bad idea and frankly won’t work. I think it’s our job as developers to do our part by utilizing standards and advocating their importance to clients, businesses and other developers.
bq. Still don’t get my point? It’s simple: the more we raise the “technical bar”, the less democratic the web becomes. Do we really want that? Is that good for the average (non-technical) person?
“Troy”:http://alistapart.com/comments/whereourstandardswentwrong?page=3#27 : I see where you’re going with this, but I don’t agree that validation is anathema to good design. The “Zen Garden”:http://csszengarden.com/ proves that working within a valid framework can lead to incredible, inspiring design: is valid markup or CSS a liability there?
I think that validation’s gotten this reputation for being a constraint because, well, it’s damned hard to do. And to be perfectly honest, I don’t enjoy checking my work in a validator. I’m sure most feel the same way. But as I said in the article, we need to get to a point where validation doesn’t require any intervention on our part, where the tools and software we use just _are_ standards-compliant. It’s not about raising the bar for building a valid site: rather, we need to discuss how to remove it altogether.
“…you’ve got a green tick” is a common refrain around here. 🙂 Generally, I won’t even look at a CSS problem until the HTML is valid. We rely on the “HTML Validator extension for Firefox”:http://users.skynet.be/mgueury/mozilla/ – while it’s not perfect, it’s an internal “good enough” standard. The whole team uses Firefox, so the validation requires no effort; I know I’d never win anyone over by insisting they validate every single page any other way.
I couldn’t agree more with the point about the hidden cost of invalid markup (especially since, as the resident CSS guy, that cost is *my* time and hair loss). In an environment where we *do* have control of the HTML, there’s no excuse.
I guess the question is, “What are the client’s objectives, and what are the benefits of not complying with standards?”
I have very few clients that have such a homogenized target audience that they could design for just one browser (like client-side VBScript for IE) or some proprietary technology.
For internal corporate projects, yeah, it might make sense, and if it shaves time off of development, then why not? But for the most part, I know my clients want to reach the broadest audience possible, which means that we have to settle on a common denominator of standards to give us a wide reach while not hampering the aesthetics or functionality of the web site.
RE #29
>>>Troy : I see where you’re going with this, but I don’t agree that validation is anathema to good design. The Zen Garden proves that working within a valid framework can lead to incredible, inspiring design: is valid markup or CSS a liability there?>>>
I never said or implied that “validation is anathema to good design”. Please re-read my post, because my point is different. In fact, I’ll make it very clear: clean, efficient markup is desirable.
As for Zen Garden, it is what it is: a very good example of what can be done with CSS within the framework of a one-page site. Most REAL projects are much more complex than that and rely on many more factors…
******
Now, the crux of this argument is brought to the forefront with my initial question:
Is regulation the shortest path to validation?
I know that the overwhelming majority of answers are going to be “are you crazy?”, but that’s the idea: the whole “validation” and “standards” movement is not very objective or realistic, and many of the benefits it proposes are just dreams or rely on self-accommodating logic. A perfect example is the Zeldman-touted concept of “future-proofed” sites based on “valid” markup. Ask any of its proponents whether using a deprecated tag is:
1. Valid code
2. Future-proof
3. Recommended
The answers should give you an idea of where this “validation” issue stands TODAY.
As previously mentioned, one of the biggest problems at the moment is the fact that the whole idea of web standards is completely unknown to a lot of people, even to those who are currently being educated to build websites in the future. I organized a workshop around the subject for my fellow ICT graduate students the other day, and it was stunning to see how little they knew about the advantages of XHTML, CSS, semantics, and the like. Proper education is essential in our effort to really make the big step away from table-based design.
Troy: To be honest, no I’m not sure I *do* get your point.
Are you saying that if we really think validation is important we should be arguing for professional regulation – and, by implication of your analogy to medicine/law, a formal ‘qualification’ before someone can legally practice as a web designer? Or that anyone publishing on the web should only be able to publish code which is valid – for example, by designing browsers which won’t display pages with invalid markup?
I agree either would be a huge step backwards for the web. So much of the power of the web is its low barrier to entry. Just as anyone with paper and a pencil can be a writer, so anyone with a computer and a free text editor (or these days access to an internet cafe and a free WordPress account) can be a web publisher.
But it is no contradiction to argue that _professional_ web workers should follow quality standards in their work, from the basics of validating their code to the complexities of ensuring access for people with disabilities; just as there is no contradiction in saying that a professional writer should create copy which follows accepted standards of spelling and grammar.
# *Standards are really important.* Doing _anything_ to standards saves time and money, increases reliability, and makes for more creative results by channeling energy away from mundane thinking into the areas where we allow variation. The added challenge of cross-browser issues is new to us because there are not many areas of human endeavor where interoperability has not been worked out.
# *Creative goals cannot trump maintainability and usability.* While I find the ‘Garden’ breathtaking and an inspiration, and it has taught me to embrace the power of modern page design standards, I am not fooled for a minute by the liberal use of text made into an image, used to cheat around the rules. Images as text content are kinda hard to maintain and change, don’t you think? We must remember that they invite _graphic designers_ to play in the garden, not -mortals- web site builders. I accept that; I still get to use the tools after they show me how.
# *Things are getting better rapidly.* The freedom we are beginning to get by being able to template behavior as well as appearance, combined with a return to programming over scripting (logic instead of rote procedure), is making webwork more and more exciting and vital.
# *Accessibility is still an issue, and it should not be.* Making sites accessible is not optional. Ever. I believe that standards will eventually enforce this in non-optional ways.
Your assertion of amusement is a straw man argument – formal education in graphic design or programming is NOT formal education in sitebuilding, period, end of sentence.
Graphic design is about aesthetics and the presentation layer of information, no more, no less.
Programming is about making computers do work.
Sitebuilding is about conveying information. This involves graphic design and programming, but is a superset of both endeavors combined. A good sitebuilder also needs a good working knowledge of Human-Computer Interaction, Information Science, copywriting, and multimedia post-production. I’ve yet to see the curriculum that addresses all six fields of expertise competently and simultaneously.
And standards compliance is important to sitebuilding for the same reasons that:
* Consistency is important in graphic design
* Process, compilation, debugging, and resource conservation are important in software engineering
* Testing and measurement are important in HCI
* Formal lexicology is important in IS
* Grammar is important in copywriting
* Effective algorithms and tools are important in multimedia postproduction
The tradeoff between ‘purity’ and ‘democracy’ is also a bogeyman. The webproles are not getting paid usually-outrageous sums of money to create public information systems meant to make and/or save money for their sponsors; they are simply following a hobby or vocation. Nor do I believe that any standards advocate would wish to impinge on the participation of the unpaid. They can and should be held to lower standards until the consumer-grade tools out there are universally solid.
To answer your questions:
# The web would be an infinitesimal fraction of its current size if user agents became strict in what they receive. But that won’t happen until tools are strict in what they send, count on it.
# Quite a few people would be demoted to less impressive jobs in the face of industry regulation, self-imposed or otherwise. But uncertified though talented operators would still be able to make a living from subcontracts. The losers in the face of a regulation push would be those who are too broke to get bonded, and too lacking in professionalism to avoid censure. Given the lack of coherent and effective curricula, I seriously doubt that a lack of your precious formal training would count for much against proven ability and experience.
# Maybe regulation would result in more common adherence to best practices, but see my comment above regarding possession of a diploma of any kind.
# Regulation with enforcement would, in fact, result in higher customer confidence.
# I suspect that there would be a direct correlation between an operator’s professionalism and his or her willingness to support a regime of regulation for this industry.
The more we raise the bar, the easier it should be for those who meet that bar to find (and get paid well and on time for) work they do for their clients. The rest can get by, and the design of the web ensures that outcome.
As for your support for a dash of bitter in everyone’s juice, it’s a crying shame you had to illustrate your point by trolling over folks’ lack of (functionally nonexistent) credentials.
I’m sorry to piss on your parade, but I’ve had a full year to focus on these issues for several hours a month, and discuss them with others in a genuinely constructive setting.
There are solutions here that let everybody win.
The best selling points for me are page rank and development time. Only more savvy clients understand the bandwidth issue fully, and the conversation can get mired in the specifics of hosting packages when I hit on this point.
My clients never care about device accessibility or otherwise unless they’ve approached me with this as one of the objectives in the first place; whether a BlackBerry, PSP, or Motorola RAZR can make it into the site never seems to be a concern (and it isn’t much of one for me either: few people are buying camera parts from their Sanyo).
Who has dealt with some common validation errors, and what were some of the ways you worked around them? They could be technical solutions, or more process-oriented—I’d just like to hear a few war stories.
Let me begin by saying most of my current clients are small businesses (accommodation cottages) located within 30 km of my office. Most are still afraid of this interweb thing – but understand the importance of having an online presence.
My clients don’t generally give a toss whether code validates, is maintainable, is built with PHP or ASP, whether you use Word or FrontPage or hand-code to generate your HTML, or whether you scratch it on a rock with a chisel – as long as
1) the site loads quickly on dialup,
2) has the information/look they want,
3) is found by Google (preferably on the 1st page of results),
4) works on their version of IE (often 5.0; mention Opera, Firefox, Safari, etc., and the common reaction is “never heard of them”), &
5) is delivered on time
… and the eyes glaze over whenever I use any technical talk (I do a pretty good line in analogy now!). I’ve not yet had a demand to use, say, Word to build the page, or a really awful CMS, or to violate every one of Vince Flanders’ design commandments (if I did, I would walk away – I can usually talk them out of some of their design requests by showing them WPTS).
As professionals (?!), this is the stuff we do behind the scenes (think duck – to the client we smoothly deliver what they want; under water we are paddling furiously). We need to convince those who write the HTML & PHP/ASP/.NET code for us that standards are the bee’s knees – whether they are W3C or internal code standards & style guides.
The standards payoff is for US, in that we will not get the phone call that xyz is complaining because they can’t see their website using the farnarkle browser, or have to go back and tune the site because it loads like a dog, or spend 3 days making a small change that should take no more than an hour.
common obstacles to validation? has anyone mentioned IE yet?
@Troy:
I guess when you wrote “its democratic nature”, you didn’t mean the freedom to use the browser, platform, or device of choice to access the web?
Re #34
Sophie wrote:
>>>>…But it is no contradiction to argue that professional web workers should follow quality standards in their work, from the basics of validating their code, to the complexities of ensuring access for people with disabilities;
>>>>
The problem with that premise is that you can’t objectively tell who is and who is not a professional.
Was Amazon.com, with a front page that lacks a Doctype while sporting a respectable 1128 validation errors, built by professionals? I’d say yes.
And who should enforce the rules? How to enforce them? What rules?
Once again, would it be good for the public if all browsers became strict in not displaying non-validating pages overnight?
And, if some room for error is left, where do you draw the line? How much is too much?
My point is: articles like this one oversimplify and distort a VERY complex issue, and that’s not good for “amateur” or “professional” alike (whatever each term means within the context of this debate!); raising the technical bar is something very delicate that requires deep reflection by everyone.
The good news is: the web will continue to evolve and it will certainly become more complex for developers and better for surfers; let’s just make sure we are not attacking the “enemy” with a boomerang…
Sander,
that’s only part of the whole pie. There’s another equally important part: the ease of publishing and becoming an active part of the web – similar to the blogging movement, which has given millions of people a voice they couldn’t otherwise afford.
Many benefit from this little mess: beginner web designers, beginner web programmers, small businesses, aspiring writers; the list goes on and on. They would all disappear overnight if “standards” were enforced overnight.
How many people can afford the expertise and resources necessary to create sites that validate?
It’s easy to make a 10-page site validate. Anything a bit more complex and you begin to deal with reality: small budgets, limited human resources, browser oddities, W3C’s own mess, interaction with 3rd party content providers, etc.
It’s not an easy topic…
I validate code for HTML emails. Sure, the code is not semantic, but running it through the validator before I run it through our email-testing program can save some time.
I don’t know of a validator for semantics. And HTML is missing some “vocabulary” anyway, so perfection is unattainable.
In the production environment I am in, in some ways semantic code wouldn’t be a cure-all for what we do anyway. I mean, if we code a landing page with a brief life-span, what difference does it make if it is future-proof?
I’m all for semantic HTML and valid coding practices, but I do understand what the article is saying.
I guess a professional website builder builds websites for his/her profession… simple as that. And people who get paid for doing a job should do it well.
On accessibility: I think both the website builder and his/her client are responsible for the accessibility of a website. The builder for actually making it accessible, and the client for demanding that it be.
@Troy:
I understand what you mean. I’m one of those uneducated (in this area) professionals you’re talking about 😉
I don’t blame amateurs or even starting pros for delivering tag soup. But there are just too many long term professionals who maintain the tag soup standard. So it would be nice if there could be some kind of regulation for them, not bothering the beginners. But as it is a global thing I wouldn’t know how to achieve that though.
Maybe a more strict rendering would not be a problem for starters if there was a good and informative debugging tool. I mean, strict guidelines can be very helpful as well, just because they’re strict.
bq. Once again, would it be good for the public if all browsers became strict in not displaying non-validating pages overnight?
No, we’re too far down the road of browsers bodging together any old bad code to click our fingers and say “from now on, only valid pages will work”.
That could happen, when the majority of browsers support application/xhtml+xml and developers start switching to that – but as they have to make the choice to switch, it won’t bring down thousands of websites with no warning.
But … if browsers were less forgiving of invalid code, I would welcome that. It would certainly make it easier for us to convince authors of the value of standards-compliant code!
bq. Many benefit from this little mess: beginner web designers, beginner web programmers, small businesses, aspiring writers; the list goes on and on. They would all disappear overnight if “standards” were enforced overnight.
If easy-to-use tools that could create compliant websites were available, that wouldn’t be a problem. And there’s no reason why they shouldn’t be able to do that. It won’t answer the problem of non-semantic code (which is partly subjective, so much more difficult to enforce), but it would stop plain old invalid code.
I don’t much mind going to a small business or a hobbyist site and finding that the technical stuff falls short – it’s like complaining that there’s a spelling mistake on the menu of your local ethnic takeaway. It’s when people claim to be professionals, and produce extensive websites that make no attempt at validation, that I get angry.
bq. How many people can afford the expertise and resources necessary to create sites that validate?
Who can’t? How much time does it take to validate a page? Seconds. How much time does it take to fix an invalid page? A few more seconds. If you’re even half-way competent, the only errors will be typos and the like – and if you’re not half-way competent, you have no business calling yourself a web designer.
Stephen Down wrote:
>>>
How much time does it take to validate a page? Seconds. How much time does it take to fix an invalid page? A few more seconds. If you’re even half-way competent, the only errors will be typos and the like — and if you’re not half-way competent, you have no business calling yourself a web designer.
>>>
If you are half-way competent you’ll understand it is not that simple, Stephen.
Go ask the people behind Google, Amazon, Ebay, MySpace (to name ONLY a FEW VERY WELL-KNOWN sites) why they are so incompetent, so unprofessional, so lacking in common sense. It should take them seconds to fix the problems, shouldn’t it?
But then again, perhaps this is a very complex issue, way beyond your understanding. Just a possibility to consider…
Isn’t commissioning a website just like buying a car, to some extent? Some cars are expensive to buy and cheap to run, others are cheap to buy and expensive to run. Still others are cheap to buy and to run. Some people want a Ferrari; others are happy with a Fiat.
Many clients simply can’t afford to do the research and commission a professional web shop to do a professional site to standard. Or, as comment #39 says, they simply don’t care. They want their site to look like their competitors’, even though technically it sucks.
A professional, standards-compliant site will cost more to commission because you’re paying for the experience of the developer/designer creating it. Such sites don’t cost more because they take longer or are more complicated; they cost more because of the experience of the person creating them.
I wonder how many of the standardistas would be willing to create a site for $500 from scratch including logo design and copywriting. But there are plenty of students (or whatever) around who know a bit of HTML, CSS and Photoshop who would be happy to do this. There’s no way you can compete with that and unless the student happens to care for standards it won’t be compliant.
Personally I think the web is just fine as it is. It’s going in the right direction. There’s enough people shouting about standards to make them important, but without the hobbyists creating all their content the web would be a very boring place indeed.
Forget regulation; education and awareness are the only things that count.
There’s no question that browsers ought to be less permissive when it comes to sloppy, invalid markup. Otherwise, it’s almost “anything goes”. This is bad not only for accessibility, but perhaps more importantly, it hinders the growth and spread of awareness regarding the value of accessible and semantic code. It also encourages developers and companies to be lazy. We need greater awareness of what valid syntax is, how to make code more accessible, and education regarding semantic markup. We can either go for the lowest common denominator (which is the approach browsers have typically taken), or we can try to raise the standards and bring everyone along. As for validation itself, I just don’t buy the suggestion that validating can be too difficult or time-consuming. Developers — professional and unprofessional alike — don’t validate because they are either unaware of standards or they don’t care enough about them to bother.
bq. Go ask the people behind Google, Amazon, Ebay, MySpace (to name ONLY a FEW VERY WELL-KNOWN sites) why they are so incompetent, so unprofessional, so lacking in common sense. It should take them seconds to fix the problems, shouldn’t it?
I was talking from the perspective of building a new site, not fixing an existing one. Fixing tag soup is never going to be easy! These sites have clearly been designed without consideration for standards or validation, so trying to fix them will not be an easy job. But if you’re writing a website with the intention of making it validate, it really isn’t difficult to do.
The majority of errors on the Amazon homepage are unencoded ampersands, with the occasional non-SGML character, missing alt text or incorrectly nested tag. Unencoded ampersands should be easy enough to fix, and that will remove most of the errors.
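To show how mechanical that fix is, here’s a hypothetical before-and-after (the URL is invented for illustration):

bc. <!-- Invalid: the raw & reads as the start of an entity reference. -->
<a href="/product?id=123&ref=nav">A product</a>
<!-- Valid: the ampersand is encoded; browsers render both the same. -->
<a href="/product?id=123&amp;ref=nav">A product</a>

Since the validator flags every raw ampersand individually, one careful search-and-encode pass can clear hundreds of errors at once.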
bq. There’s no question that browsers ought to be less permissive when it comes to sloppy, invalid markup. Otherwise, it’s almost “anything goes”.
Um, actually, I do question that reasoning.
I’ve noticed several posters have latched on to the idea that if we make standards into rules, then developers will have to follow them. Fortunately, an equal number of posters have countered with the observation that all computing is built on the legacy of someone else’s work. Simply put: enforce rules, and you break most of the Internet.
This is why standards are practical. In spirit, a democratic group of people got together, threw around some ideas, and agreed upon some educated guidelines. The more we exercise these guidelines, the more organized we’ll all be. But no one is going to force you.
In a reasonable world, the organized group with standards would have a bit of industry clout. And they do; browser makers heed standards when building new versions (some, better than others).
But (and this is just my thinking, here) standards hold very little clout with clients. They hold no brand name appeal, like “Web 2.0” or “Flash” or “AJAX.” Until they do, clients won’t start demanding them.
“Stephen”:#52 makes a good point about legacy code: pretty hard to standardize once things are already going.
The author anticipated this with a counter argument about how time gets eaten up, debugging code not written up to standards. This is true for spaghetti code, certainly, but I’m willing to bet places like Google and MySpace have a methodology in place.
Methodologies are also like standards: not rules, but guidelines. And, independent of standards, a group of professionals can follow a methodology _as if_ it were their “standard.” Capiche?
(Actually, I KNOW that MySpace uses a methodology: Fusebox. We use the same at my job. I’m the only guy on staff who’s a standardista, and yet we’re able to fix bugs lickety-split.
Of course, I make less of them, in the first place…)
There are a number of issues at odds here. In my case, in the company I work for, the problem now is not convincing the business of the benefit of developing to web standards; it is actually educating some of the people who work in the web department here in how to use them effectively. It is very easy to produce technically valid code that is just as nasty, in a tag-soup kind of way, as any of the ‘old’ table-based websites we have all grown to scorn so much.
The problem with this is that there are a number of developers from the ‘old school’ who haven’t taken on the principles behind all this standards-compliance madness that has been sweeping the web over the past couple of years. Their mentality is still stuck in the ‘structural’ rather than the ‘semantic’. They don’t yet really grasp that if you span everything up and produce a million and one styles needlessly, you really don’t have many of the benefits that the standards-compliant web should offer. Development time and bug-fixing time remain high, and page weight is just as high (if not actually higher, as was the case with a major project I took on a total code re-write for a few months ago).
One of the other things I seem to hear and read a lot these days is that web browsers should be less tolerant of poorly written code. To me, this attitude, although it might make our lives easier, is wholly against the spirit of the internet. Forcing anyone who wants to put together a web page to learn what are, for many, quite complicated languages would prove such a barrier that many people would be far less inclined to use the internet to publish their views, interests, loves, and hates. Admittedly, this situation is subsiding as so many people just use a simple blogging engine, but what about the kids who are just starting out, who have an interest but only a (fairly bad) WYSIWYG editor? I started out like that, years ago, and now I’m here as a web professional: the authority in a very, very large business on how we should be developing our platforms to web standards. Stricter browsers would also completely break much of the old content out there. Some of you would say it shouldn’t be there in the first place, but the people who made that content might love it, and I for one don’t want to take that away from them.
As for my employers? Like I said, selling web standards to them isn’t a problem any more. In fact, much of the time I wish they would pay less attention to the buzz and trends on the internet. They have all gone mad for ‘Web 2.ohmygod’ and want to jump on this ‘cool’ bandwagon. We even have links to add our pages to del.icio.us and Google Bookmarks, for god’s sake! Let me make this clear: we are NOT a cool company!
I like web standards. They make my life… interesting. Internet Explorer makes my life with web standards somewhat more painful, but I can live with that. I think MY next step will be to sell web standards, and more specifically their proper use, to many of my colleagues. It’s going to be a hard sell: they have been doing this for longer than I have, and that, in large part, is the problem.
I can claim a win for valid code.
I persuaded a wagering company (mainly bookmakers, plus casino and poker) to use valid XHTML 1.0 Transitional and CSS2 on their wagering (e-commerce) sites. We made the case on:
- fast downloads (they value customer service),
- SEO (they face restrictions on advertising in most countries they operate in),
- quicker issue resolution,
- forward compatibility,
- use of JavaScript against a valid DOM,
- making it easier for the marketing team to produce creative for the sites (there are currently five sites in the stable, all running variations of the code), and
- the ability to roll out more websites under different brands (about 70% of the CSS is common to all sites and the rest is brand-specific; see the sketch below).
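One plausible way to realize that common/brand split (file names invented here, not the commenter’s actual setup) is a shared stylesheet plus a per-brand override in each site’s head:

```html
<head>
  <title>Brand X Wagering</title>
  <!-- The ~70% shared across all five sites: layout, typography, forms -->
  <link rel="stylesheet" type="text/css" href="/css/common.css" media="screen" />
  <!-- The brand-specific remainder: colours, logo treatments, promotions -->
  <link rel="stylesheet" type="text/css" href="/css/brandx.css" media="screen" />
</head>
```

Because the brand sheet loads last, its rules win any ties in the cascade, so each new brand only has to restate what actually differs.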
I had to go in and hand-code every single screen of the web apps, four days a week for six months, then hand-hold the IT department through integration into their ASP.NET web applications (they grumbled every time I didn’t let them use a library control). But it all works and it delivered for the end user. The sites now turn over almost a billion Australian dollars per year, and we’re brought in as a team to deliver enhancements and minor maintenance.
It’s now a battle to keep it all clean with the .NET developers still under pressure to roll out library controls every week; but the sites are still largely valid and standards compliant.
This business is completely results-focused, and they are very happy with the work we did; so I reckon anyone can be persuaded to implement the basics. Just be prepared for the client-side code to depart from standards and need herding back every month.
Congratulations on a really great piece of writing.
I agree with Lasse that SEO sells. It is thus possible to push a client toward a semantic website, as semantics play a part in overall optimization.
I generally find that most clients who want a website don’t think of it in code. They look at it visually and naturally hire a graphic designer to mock something up for them in Photoshop before hiring the web developer to “code it.”
A big mistake clients make is treating the “coding” of a page as the LAST step. This often leads to shortcuts and hacks in the markup and CSS, cross-browser incompatibilities, and, when working in a team, the worst scenario of all: a mish-mash of markup. It can truly ruin a project and makes standards compliance a sick joke.
As an aside, another barrier I’ve found is the team itself. If there is more than one developer working on the markup you most often get one of two extremes: either the team gels and conforms to sets of conventions and markup style or they play by themselves and mix up their styles and hacks. Communication is key and requires that designers agree to standard ways of implementing the standards. When it works, it’s gravy — but most people groan when someone suggests introducing more process. Some people feel they are more productive in the “cowboy coding” method.
In the past I’ve found that I was most successful as the sole developer and designer. I was mostly happy as the developer working with a designer who (mostly) understood CSS and web design. I was mostly unhappy when having to hack a graphic designer’s PSD into a usable design. And I was miserable when I had to hack out designs over an entire site and various applications, in a team, using various CMSes and templating systems.
Is it impossible? No — but it does require a certain level of organization and discipline in the process to work.
It’s actually the “anything goes” philosophy that has made the web so powerful. Standards are great, but if browsers became less permissive it would, as someone else pointed out, break the web. I think regulation would be the worst thing to happen to the web and the internet. It was created as a medium of open expression, and if we started telling people they had to follow set “rules,” I think we would lose that freedom. Businesses and web development teams should establish best practices and enforce standards internally, but it’s not contingent upon anyone but them to enact those standards.
As the web grows, and users move from PCs and web browsers to mobile devices and even kitchen appliances, the need for standards and accessibility will increase, and eventually the new will supplant the old. There will be no need for regulation: the web (the USERS of the web) will regulate itself over time.
I guess the answer is DocTypes…
Quirks mode can still be the sandbox for beginners, while professional website builders could be required to use certain doctypes for which browsers act less permissively.
Look, most browsers already follow certain rules and standards. That’s why many tags work as we expect them to, and basically compel us to code in certain ways. If they didn’t, then why have tags at all? Why even have a language like (X)HTML at all, which is itself based on rules and practices? Asking browsers to be a bit more strict is not the same as calling for “regulation” by some unseen government, and I don’t think it will “break the web” either.

The other implication of what I read from some posters goes something like this: “Let’s not bother trying to educate people who are probably incapable of understanding HTML or of making web pages that reach the broadest possible audience. Let’s just keep everyone at the lowest common denominator of knowledge and skill. And don’t bother informing people that pages made with certain programs or methods will be inaccessible to readers who rely on assistive devices.”

What kind of freedom are we talking about? This reminds me a bit of Rousseau: we may need to give up some of our ‘natural’ liberty (no rules at all, a state of nature) in order to achieve a form of ‘civil’ liberty on the web, which is a form of association and yes, in some sense, a social order.
@54: You mean fusebox.org, where whitespace before the doctype declaration ensures that IE will bounce into Quirks mode, even though the site is declared as XHTML 1.0 Strict (no, there are 24 errors, so it isn’t)?
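For anyone who hasn’t been bitten by this bug: per the comment above, content preceding the doctype (the XML prolog being the best-documented culprit) drops IE6 into Quirks mode, so even a strict doctype buys you nothing there. A sketch:

```html
<?xml version="1.0" encoding="utf-8"?>
<!-- Anything above the doctype, like the XML prolog here or stray
     whitespace emitted by a template, sends IE6 into Quirks mode,
     so the strict doctype below never takes effect in that browser -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
   "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
```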
As for Google, I just don’t understand how such a simple home page can have 30 errors on it (.co.uk), and why they can’t be bothered, in their 20% hacking time, to fix the bloody thing, even slightly. The fact that the page still works is more luck (and browsers accepting all kinds of crap) than judgement.
Send Google’s website through the W3C validator and look at the number and kind of errors you get. Remember that this is probably the most-visited website in the world, made by some of the top professionals in the field, aiming at the highest possible cross-browser compatibility and user accessibility.
Only a few months back I was constantly exhausted and approaching a serious depression. I wanted to be at the forefront of technology. I felt proud. I made every webpage I designed to validate as XHTML strict. All websites looked just supercool in Firefox, but shitty in at least a quarter of the browsers used by my visitors. I did not sleep much, because I tried to meet deadlines and still vertically center some liquid CSS tag-soup in IE 5 Mac. I often cried and hated to be alive for the first time since leaving puberty.
Until I looked at Google’s source code and realized that I had been wasting my life over nothing.
XHTML and CSS were developed by folks who frequent text-only websites; LOOK at the sites these guys output! I do graphics. Now I again do what works best: a mixture of table layout and CSS. Some of my code still validates. I sleep well. My sites look just perfect in all browsers (even Lynx).
If you think about it, you will easily understand why the movement to propagate web standards failed: no one will give up eating with his fingers for a blunt knife and a flat spoon.
You can validate font tags, unquoted attributes, unsemantic markup, and bold tags instead of proper headings.
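A contrived fragment (invented for illustration) that would pass the validator inside an HTML 4.01 Transitional document, which is exactly the problem: the validator checks conformance to a DTD, not semantics.

```html
<!-- Every element and attribute here is legal under the Transitional DTD:
     unquoted single-token attribute values, presentational font and b tags,
     and no real heading in sight -->
<p align=center><font color=red size=5><b>Welcome to my home page</b></font></p>
```

A strict doctype would reject the font tag and the align attribute, but no DTD can force you to use an h1 where one belongs.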