A List Apart

Beyond DOCTYPE: Web Standards, Forward Compatibility, and IE8

Issue № 251

Published in Browsers, CSS, HTML, JavaScript

Progress always comes at a cost. In the case of web browsers, users bear the cost when developers take the rendering of certain authoring tools and browsers (especially Internet Explorer) as gospel. When a new version of that browser comes along and fixes bugs or misinterpretations of the spec (or introduces new ones), or in any way changes behavior, sites break and our clients, bosses, and users get very unhappy.

We could spend hours explaining why our sites broke, but wouldn’t it be better if they didn’t break in the first place?

A little background

Building on the momentum created by the release of Internet Explorer 7, which included major advances in CSS support, the IE team began work on a completely new rendering engine for IE8—one that followed the CSS 2.1 spec as closely as possible. The culmination of their efforts is a browser capable of rendering the Acid2 test accurately. For those of you keeping track, this means that IE will soon support generated content and data URLs, and, it has been confirmed, will banish hasLayout forever. This will put its rendering on par with other browsers that have passed Acid2, including Safari, iCab, Konqueror, and Opera. (Firefox 3, which passes Acid2, had not been released as of this writing.)

Throughout the development of the new engine, the IE team has been mindful of the backlash they received upon the release of IE7. Some standards zealots and even a few Microsoft fans felt that they didn’t go far enough in IE7 with bug fixes and improvements to CSS support. But a far greater number of developers gasped in utter disbelief as their websites, which looked great in IE6, broke in IE7. On his blog, standards advocate Roger Johansson offered three reasons for the breakage, and in their drive to improve standards support, the IE team discovered a fourth: the DOCTYPE switch, a core technique enabling modern CSS layouts, is fatally flawed as a way to protect compatibility.

The DOCTYPE switch is broken

Back in 1998, Todd Fahrner came up with a toggle that would allow a browser to offer two rendering modes: one for developers wishing to follow standards, and another for everyone else. The concept was brilliantly simple. When the user agent encountered a document with a well-formed DOCTYPE declaration of a current HTML standard (i.e. HTML 2.0 wouldn’t cut it), it would assume that the author knew what she was doing and render the page in “standards” mode (laying out elements using the W3C’s box model). But when no DOCTYPE or a malformed DOCTYPE was encountered, the document would be rendered in “quirks” mode, i.e., laying out elements using the non-standard box model of IE5.x/Windows.
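To make the toggle concrete, here is a sketch of the two cases (the HTML 4.01 Strict declaration shown is one of the well-formed DOCTYPEs that triggers standards mode):

```html
<!-- A well-formed DOCTYPE for a current standard: the browser
     assumes the author knows what she is doing and renders the
     page in "standards" mode, using the W3C box model. -->
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
  "http://www.w3.org/TR/html4/strict.dtd">

<!-- Omit the DOCTYPE entirely (or use a malformed one), and the
     same document is rendered in "quirks" mode, using the
     non-standard box model of IE5.x/Windows. -->
```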

This concept was first implemented in IE5/Mac two years later, and was quickly adopted by the other browser makers. Standards-aware developers were already including a DOCTYPE declaration in their documents for validation purposes, so it required no extra effort on their parts to get browsers to render documents according to the spec. Developers who weren’t standards-minded were blissfully unaware that their documents were being given special treatment because neither they nor the tools they were using inserted well-formed DOCTYPEs.

Unfortunately, two key factors, working in concert, have made the DOCTYPE unsustainable as a switch for standards mode:

  1. egged on by A List Apart and The Web Standards Project, well-intentioned developers of authoring tools began inserting valid, complete DOCTYPEs into the markup their tools generated; and
  2. IE6’s rendering behavior was not updated for five years, leading many developers to assume its rendering was both accurate and unlikely to change.

Together, these two circumstances have undermined the DOCTYPE switch because it had one fatal flaw: it assumed that the use of a valid DOCTYPE meant that you knew what you were doing when it came to web standards, and that you wanted the most accurate rendering possible. How do we know that it failed? When IE7 hit the streets, sites broke.

Sure, as Roger pointed out, some of those sites were using IE6-specific CSS hacks (often begrudgingly, and with no choice). But most suffered because their developers only checked their pages in IE6—or only needed to concern themselves with how the site looked in IE6, because they were deploying sites within a homogeneous browserscape (e.g. a company intranet). Now sure, you could just shrug it off and say that since IE6’s inaccuracies were well-documented, these developers should have known better, but you would be ignoring the fact that many developers never explicitly opted into “standards mode,” or even knew that such a mode existed.

Chris Wilson, Platform Architect for Internet Explorer, has often said that one of the core tenets of development on IE is that any choices the IE team makes must not “break the web”. Sadly, IE7 did just that for quite a number of people. Unwilling to make the same mistake twice, Microsoft reached out to The Web Standards Project (of which I am a member) and to several other standards-aware developers, and asked for our help in coming up with a better method of allowing developers to “opt in” to proper standards support. The goal was to find a method that was more explicit than the DOCTYPE switch, and could be implemented in any browser, not just IE.

Future perfect

At last year’s SXSW, I had the good fortune to watch a fantastic panel led by New York Public Library’s Carrie Bickner (who also happens to be the wife of ALA’s publisher, Jeffrey Zeldman). The panel, “Preserving our Digital Legacy and the Individual Collector,” amounted to a discussion of the problems libraries and individuals run into when trying to maintain digital archives. Most of these problems stem from advances in file formats and applications: Microsoft Office 2007, for example, cannot reliably render a Word 1.0 document as it was originally intended to be rendered. The panel got me thinking about how the web has changed since its creation and how it will continue to change as web standards evolve.

As a proponent of web standards, I want to see browsers continually improve their implementations of standards while adding support for new ones, but I also see it’s important to preserve the web we’ve worked so hard to build—table-based layouts and all. Sure, most trips through the “Wayback Machine” don’t suffer in modern browsers because the DOCTYPE switch still serves them well, but what about a site built to IE6’s implementation of “standards” mode? We already know that, in many cases, IE7 won’t render it properly. Does that mean that we need to keep a copy of IE6 on hand in order to view the page as the author intended? That’s exactly what many libraries have done in order to be able to view elderly files. With IE8 on the horizon, we have the same potential problem with documents created using IE7’s rendering engine. What’s the solution?

Targeting a browser version

In an ideal world, of course, all specifications would be perfect from the get-go, and their implementation in user agents would be immediate and flawless. In a slightly more down-to-earth version of an ideal world, browser vendors would immediately integrate regularly updated standards into new user agents—and users would have instant access to the latest version of those browsers without having to lift a finger. Were that the case, we developers would be able to build sites and applications that take advantage of the latest and greatest web technologies without worrying about backward compatibility. But as we all know, the world is nowhere near even that level of perfect.

Standards are developed and advanced in fits and starts, sometimes taking several years to find their way to “recommendation” status. Browser release cycles are driven by product management and marketing concerns—security, features, speed—and rarely coincide with the finalization of standards specifications, even when the browser makers themselves have been intimately involved with the development of those very standards. And users, especially within an organizational context, are often slow to upgrade their browsers.

All of these factors leave us, the website developers, in a bit of a pickle when it comes to making websites. How do we ensure that browsers continue to render what we want them to?

We could specify the version of the languages we use, such as CSS 2.1 or JavaScript 1.5. Unfortunately, browser vendors often implement only part of a spec, and their interpretations of a specification often differ, so any two contemporary browsers may offer completely different renderings of the same CSS or may trigger completely different events from the same form control.

With this spanner in the works, we’re really only left with one option for guaranteeing a site we build today will look as good and work as well in five years as it does today: define a list of browser versions that the site was built and tested on, and then require that browser makers implement a way to use legacy rendering and scripting engines to display the site as it was intended—well into the future.

This is exactly what our group decided to recommend for IE8, and we hope to see it implemented in other browsers as well.

Keeping the syntax simple

One key to ensuring that this browser “version targeting” was easy for developers to adopt was to make it easy to implement by hand or in an authoring tool. We considered many syntax options, including a conditional comment-like syntax, processing instructions a la the XML prolog, and even HTML profiles such as those adopted by the Microformats community, but few seemed to fit the job as well as the meta element.

Using a simple meta declaration, we can specify the rendering engine we would like IE8 to use. For example, inserting this:

<meta http-equiv="X-UA-Compatible" content="IE=8" />

into the head of a document would make IE8 render the page using the new standards mode. This syntax could be easily expanded to incorporate other browsers as well:

<meta http-equiv="X-UA-Compatible" content="IE=8;FF=3;OtherUA=4" />

In the interest of speeding up the processing of the lock instruction, it is important to prioritize the version-targeting meta element in much the same way as we prioritize character-encoding information: it must be placed in the head of your document, as close to the top as possible. It can be preceded by other meta elements and the title element, but must come before any other elements, and it cannot be added to the DOM via JavaScript.
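Putting those placement rules together, a document head might look something like this (the stylesheet reference is, of course, just an illustration):

```html
<head>
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
  <!-- The lock: only other meta elements and title may precede it -->
  <meta http-equiv="X-UA-Compatible" content="IE=8" />
  <title>Example page</title>
  <!-- Everything else (link, style, script) follows the lock -->
  <link rel="stylesheet" href="screen.css" media="screen" />
</head>
```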

As those of you with keen eyes probably noticed, the meta element we are using here is of the HTTP-equivalent variety, which means we can set the following header on the server to get the same effect:

X-UA-Compatible: IE=8;FF=3;OtherUA=4

We can also use both methods in concert. For example, it is possible to set a baseline lock on a whole site using the header method and then override that header on individual pages, as needed, using the meta element.
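On Apache, for instance, a site-wide baseline lock could be set with mod_headers; this is a sketch assuming that module is enabled, so adapt it to your own server:

```apacheconf
# httpd.conf or .htaccess: lock the whole site to IE8's engine
Header set X-UA-Compatible "IE=8"
```

An individual page could then override that baseline with its own meta element (say, content="IE=7") without touching the server configuration.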

Whither progressive enhancement?

Having the ability to lock your site to a particular browser version is fantastic for ensuring that your site will be usable well into the future, but does it undermine the concept of progressive enhancement? Will we have to alter the way we build sites? Can we still take advantage of new CSS properties automatically, as they become available? These were some of the many questions I had when we began discussing a possible version-targeting mechanism.

For instance, let’s say IE8 wasn’t going to support generated content—if the Acid2 announcement is any indication, it should, but just bear with my use of it as an example—and we used generated content on a website that “targeted” IE8. Every other modern browser with the exception of IE would render that generated content, but even if IE9 included support for generated content, someone using that browser would not see the generated content because the site was locked to IE8. The site’s lock would need to be updated to IE9 for the generated content to appear, which goes against the core concept of progressive enhancement.

As much as it pains me to lose this particular aspect of progressive enhancement, this behavior is honestly the best thing that could happen, especially when the site concerned is public-facing. After all, we shouldn’t make assumptions about how browsers will behave in the future. If a change in IE9 would break the layout of our site or the functionality of one of our scripts, that could be disastrous for our users, sending our team into a mad scramble to “fix” the website that was working fine before the new browser launched (which is pretty much the boat we’re in now). Version targeting gives our team the ability to decide when to offer support for a new browser and, more importantly, gives us the much-needed time to make any adjustments necessary to introduce support for that new browser version.

So does version targeting spell the end of progressive enhancement? At this point, no. First of all, we will be dealing with legacy/pre-lock browsers for years to come, and progressive enhancement is a proven way to manage the differing levels of CSS and JavaScript support among them. Furthermore, there will still be a place for conditional comments to deliver style and scripting patches to IE browsers, though we hope there will be a diminishing need for them over time. Finally, writing JavaScript using progressive enhancement techniques will still greatly cut down on the refactoring time needed when preparing to launch support for a new browser.

Extra credit: living on the “edge”

For those willing to throw caution to the wind, let the chips fall where they may, or any other manner of colloquialism for coding with reckless abandon, IE will support a keyword value of “edge”:

<meta http-equiv="X-UA-Compatible" content="IE=edge" />

This option, though strongly discouraged, will cause a site to target the latest IE browser version as it is released. It is a far cleaner alternative than the inevitable hack of setting an arbitrarily high value—IE=1000, anyone? But even with all of the benefits of version targeting, the “edge” value is probably not practical for anything but experimental websites: not even Eric Meyer can predict the layout or scripting bugs that may be accidentally introduced by a new browser version.

Hope for the future

For many years, we designers and developers have been yearning for a way to reliably deploy our websites. In addition to the headaches of writing cross-platform styles and scripts, we’ve had to deal with the fallout from new browser releases that inevitably broke something we couldn’t possibly have anticipated. It’s never fun explaining the cause of an unexpected break to our clients, bosses, and users. But with IE8’s introduction of version targeting, there is a light at the end of the tunnel. I, for one, hope other browser vendors join Microsoft in implementing this functionality.
