As I read through Aaron Gustafson’s Beyond DOCTYPE: Web Standards, Forward Compatibility, and IE8, my immediate gut reaction was deeply negative. The version-targeting mechanism Aaron described was just wrong, completely backwards, the exact opposite of what we ought to be doing. Every one of my instincts, honed over a decade-plus of web development, was in opposition.
Why did I react that way? Partly because version targeting looked like the revenge of browser sniffing. True, before browsers supported standards correctly, sniffing was often a necessary way of coping with their incompatibilities, but it never really worked in the long run. No sooner did you finish uploading your script than a new version of an old browser came along to break it. The fragile, self-defeating nature of browser sniffing was one of the forces behind the rebellion that eventually brought standards to our browsers. And here it was, I thought, being legitimized and enshrined in the code base of a web browser.
Primarily, though, I was bothered by version targeting because it runs contrary to the principle of forward-compatible development. This has been the best practice of our industry for years now, a way-of-being learned the hard way in the browser wars. We develop with an eye to the future, using features that are widely and stably implemented and only adding “cutting-edge” features when they don’t impair use of the site—this last practice known as progressive enhancement. One example of this approach is the techniques described in “Going To Print”, which add URLs in printed pages for advanced browsers but don’t prevent or break printing in less capable browsers.
With version targeting, the incentive to plan ahead, to be forward-looking, is almost entirely destroyed. Instead, the browser makes a promise to always be backwards compatible. In effect, version targeting is like Time Machine for web browsers. The idea is that when IE10 loads up my IE7 page, it rewinds itself to act like IE7 did, all those years ago—no matter what changed in the meantime.
Thus, as a developer, there’s no need to look beyond the current state of browsers. I can just assume that browsers will always support what I’ve done even if it’s the worst kind of short-sighted, browser-specific, who-needs-standards-anyway type of development possible. And as for the expected direction of browser support for CSS or JavaScript or HTML5 or whatever…who cares?
Reality check
Well, who does care? The readers of A List Apart, surely, and there are a great many of us. But as survey after analysis shows, the vast majority of web content is produced without much regard for standards-based, forward-compatible principles.
Yes, we have made great strides; yes, the work of educating developers has borne some fruit. Still, we need to be honest about this. We’re not reaching everyone, and probably never will. Some sites will be developed according to what the browser-of-the-moment does, no matter how incorrect that might be in comparison to a specification or even other browsers’ behaviors.
This creates a dilemma for browser vendors when faced with bugs in their implementations: fix it or preserve it? The classic example of this was the original implementation of height and width in Internet Explorer, which was wrong per the CSS specifications. The IE team at the time became aware of this fairly soon after they shipped it in IE3…and yet the problem wasn’t fixed until IE6, a delay that slowed the adoption of CSS and gave rise to a whole family of JavaScript sniffers and CSS hacks.
DOCTYPE switching came to the rescue there, of course, allowing IE6 to preserve its old (wrong) behavior in “quirks mode” and do the right thing in “standards mode”—a mechanism introduced in IE5 for the Macintosh and quickly adopted by other browsers.
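The mechanism hinges entirely on the document’s first line: a complete DOCTYPE declaration triggers standards mode, while its absence (or a partial, old-style one) triggers quirks mode. A minimal sketch of the two states:

```html
<!-- A full DOCTYPE like this triggers "standards mode"
     in DOCTYPE-switching browsers: -->
<!DOCTYPE html PUBLIC "-//W3C//DTD HTML 4.01//EN"
    "http://www.w3.org/TR/html4/strict.dtd">

<!-- Omitting the DOCTYPE entirely puts the page in "quirks mode",
     preserving the old (wrong) behavior for legacy pages. -->
```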
Let’s consider that for a moment. With the DOCTYPE switch, web browsers effectively recognized two version states: old and current. There was the way things were done in Ye Olden Days, before DOCTYPE switching, and the latest and greatest.
Speeding advances
So we already have an example of version targeting in the DOCTYPE switch. Once I came to that realization, my instincts were thrown into confusion. After all, I was a big proponent of DOCTYPE switching, and still rely on it to this day. Did I hate this whole idea, or not?
Like DOCTYPE switching did in 2000, version targeting negates the vendor argument that existing behaviors can’t be changed for fear of breaking web sites. If IE8 botches its implementation of some CSS property or DOM method, the mistake can be fixed in IE9 without breaking sites developed in the IE8 era.
This actually makes browser vendors more susceptible to pressure to fix their bugs, and less fearful of doing so. That’s huge, as fundamentally game-changing as DOCTYPE switching was, but on an ongoing basis. Just imagine how much sooner height and width could have been fixed in IE, had this mechanism been in place from the beginning.
Furthermore, if this all works as advertised, it’s eventually going to make web development a lot less reliant on virtual machines. If you need to support the current and previous versions of a browser, you just change your X-UA-Compatible value to the older version and see how things are—no copy of VirtualPC required. That won’t happen right away, but it’s a reasonable eventual outcome.
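For reference, the proposal expresses the target either as an HTTP header or as a meta element. A sketch of the meta form as proposed (note that the non-IE tokens like FF=3 were illustrative in the proposal; no non-IE vendor has committed to honoring them):

```html
<!-- Lock this page's rendering to the IE8 engine.
     Dropping the value to IE=7 would ask future IEs
     to render the page as IE7 did. -->
<meta http-equiv="X-UA-Compatible" content="IE=8;FF=3;OtherUA=4" />
```

The same value can be sent server-wide as an `X-UA-Compatible: IE=8` response header, so a site need not touch every page.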
The new sniff test
We’re over browser sniffing, though, aren’t we? Didn’t somebody call it “fragile” and “self-defeating”? (Ahem.)
Well, yes, but there are crucial differences between “browser sniffing” as we know it and the proposed version targeting. For one thing, “browser sniffing” at present means “writing code to check what browser is being used and make adjustments to the markup/CSS/JS/server response/whatever accordingly.” Version targeting reverses that completely, making it “the browser checking the page to see when it was developed and making adjustments to its behavior accordingly.” In other words, version targeting frees web developers from sniffing and places the onus on browser developers instead.
That’s not a change to be lightly dismissed. Browser implementors, for all they frustrate us with (often justified) pleas of limited resources, still command far more resources and expertise in regression testing than any of us can muster. Furthermore, browser developers have a far more vested interest in making sure the version targeting works as promised and doesn’t break old sites than site authors do in updating their old sites to work in new browsers.
The benefits of hindsight
The second major difference between browser sniffing and version targeting is that browser sniffing looks forward while version targeting looks back. Looking forward is one big reason browser sniffing is fragile: it’s hard to predict the future. To pick one example, Safari’s inclusion of “like Gecko” in its user-agent identifier broke a fair number of sniffer scripts—even those that were comparatively well done. The authors of those scripts had simply failed to predict that a non-Gecko browser from Apple would include the word “Gecko” in its user-agent identifier.
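The fragility is easy to see in code. Here is a sketch of the kind of naive sniffer that era produced, which assumed any user-agent string containing “Gecko” meant a Gecko-based browser such as Firefox (the function name and UA strings are illustrative):

```javascript
// A naive sniffer of the era: treat any UA string containing
// "Gecko" as a Gecko-based browser. This is the fragile pattern
// being described, not a recommendation.
function isGeckoBrowser(userAgent) {
  return userAgent.indexOf("Gecko") !== -1;
}

var firefoxUA = "Mozilla/5.0 (Windows; U; Windows NT 5.1) " +
                "Gecko/20070725 Firefox/2.0.0.6";
var safariUA  = "Mozilla/5.0 (Macintosh; U; PPC Mac OS X) " +
                "AppleWebKit/125.2 (KHTML, like Gecko) Safari/125.8";

isGeckoBrowser(firefoxUA); // true, as the author intended
isGeckoBrowser(safariUA);  // also true: "like Gecko" fools the check
```

The script’s author could not have predicted that a non-Gecko browser would ship “like Gecko” in its identifier, and the sniff silently misfires.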
Now we have the prospect that browser sniffing will be done by the browsers, and will look back. This is inherently far more stable: the past is always a lot easier to predict than the future.
Besides which, we’ve written enough scripts and hacks to make our pages adjust to browsers. Isn’t it about time browsers started adjusting to our pages?
To sum up
We know forward-compatible development works. More to the point, though, it’s all we’ve had. Since the inception of the web, with the sole exception of DOCTYPE switching, browsers have been a “what I do is what you get” proposition. Developers have been forced to conform to past browsers’ behaviors while making educated guesses about what future browsers would do.
Forward-compatible development and its cousin, progressive enhancement, were necessary and proper because they were the only hope we had of sites continuing to work into the future. The mantra of forward compatibility was necessitated by the world in which we worked.
In a world where browsers had done version targeting from the outset, there would have been another option. Who knows what might have happened? Perhaps we’d find the very idea of forward-compatible development hopelessly fragile, even laughable.
We say forward-compatible development is the mark of a professional because that’s what the profession demands. With the advent of version targeting, that need may simply evaporate, rendered not wrong but moot. And though my deeply-ingrained instincts still fight that conclusion, I have to do my best to look at this possible future and ask myself if it looks better or worse than what we’ve known.
It looks better.
So in the end, and much to my surprise, it turned out that I don’t hate the idea after all. Version targeting allows browsers to much more easily develop new features and fix bugs and shortcomings in existing features, which has the potential to speed up the evolution of web design and development. That alone is reason enough to give it a chance.
Yes, but…
Of course, I still have concerns.
The biggest concern is fidelity. Will the backwards-compatible code for IE8 always act exactly like IE8 did, or will there be subtle changes that still break old sites? Might there even be, dare we mention it, new bugs that affect the backwards compatibility of future browsers? After all, the door swings both ways: vendors might get lax about their backward-looking code just as developers might get lax about their forward-looking code. Talk about irony.
A small concern is the effect of version-targeting code on the size of browser applications themselves. Could this be a step toward browsers becoming bloatware? Someone will chime in with “Who cares? Hard drives are huge now!” but I remain solidly unconvinced by “resources are cheap” arguments. No matter how cheap they are, people still keep filling them up. I sincerely hope the browser of the future won’t require a gigabyte or two of storage space, chained to every previous version of itself like Jacob Marley to his past misdeeds.
Terminology
I’m definitely not a fan of the “edge” keyword. The reason for its existence seems to be so that nobody has to hack their way around the targeting mechanism with “IE=1024” or some other large number. The problem is that providing a keyword equivalent for that creates an aura of official blessing that I don’t think Microsoft wants to give. It’s in their interest to have everyone use this mechanism, and the keyword acts as a wink and a nod to people who want to avoid it. I’m all for people hacking around the targeting mechanism if they want—I may well do it myself—but that should be done with a hack, not an official keyword.
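In markup, the officially blessed keyword and the hack it stands in for would look like this (a sketch of the proposed meta syntax):

```html
<!-- The official opt-out: always use the latest engine available -->
<meta http-equiv="X-UA-Compatible" content="IE=edge" />

<!-- The hack it legitimizes: a version number so large
     that no shipping browser will ever reach it -->
<meta http-equiv="X-UA-Compatible" content="IE=1024" />
```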
DOCTYPE as version targeting
I wish I could be happy about the way pages are handled in the absence of any version-targeting information. If a page doesn’t have any version-targeting information, then the DOCTYPE will be used as a proxy for version targeting. For example, all the HTML4 and XHTML1 DOCTYPEs will be targeted to IE7 by default. In the future, HTML5 DOCTYPEs might by default be targeted to IE9 or IE10, depending on how things shake out.
Of course, a developer can avoid all that by providing an explicit browser version: an HTML2 document can be targeted to IE9; an HTML6 document can be targeted to IE7. But in the absence of explicit version-targeting information, the DOCTYPE will be used as a stand-in and map to a specific version. From Microsoft’s point of view, this is necessary: without this, untargeted pages could be broken by new versions of IE. I get that. But it means that in order to have pages handled the way they’ve always been, essentially moving forward with browsers, you have to hack around the targeting mechanism with a really large version number—or the edge keyword, if it doesn’t get dropped.
The biggest challenge, it seems, will be to make sure that version targeting is done in such a way that it will work into the future and not break down over time like DOCTYPE switching did. In other words, we need to make sure it’s forward-compatible.
I guess those instincts came in handy after all.
I am all for Version Targeting because it gives my development team the ability to create stable releases. The team can push sites (or pages) live with respect to a given UA version rendering, and continue local development on the next version. When the new UA version comes, a responsible dev team should look to implement the new standards implemented (if any). I feel like we’ve been fighting for so long to have a way to preserve page rendering, and this is a great step in that direction.
My gut is in knots just trying to think about how this would change the game.
As developers, we need to be able to know where we stand and be able to test our code against something that is solid and finite. We don’t live in a world where people update their systems and browsers with every advancement. Recently we have seen how some users who make the effort to keep up force companies to offer them a way to go back after the upgrade. (Thank you Vista)
I also ask myself, can we really depend on browser vendors to keep the keys working the same way with each new release? They have already proved they can barely handle supporting their most powerful users, the web developer. Isn’t the reason we have virtual machines because some browsers are so OS dependent we can’t make a stand alone version?
We need a solution that highly encourages people to keep up and think ahead when working on the web. As more people finish school where they are educated on web standards and browser developers continue to work hard on software that adheres to the standards, we will have a more standards compliant web to surf.
No change at all is better than what is proposed so far. If the vendors give us something with good “form”, we can give the users good “function”.
…which reality this was posted from where browser vendors have the resources to maintain N versions of their engine, including security and crash fixes (consider that some IE7 feature deprecation was due to security concerns).
This is madness, pure and simple. Even if browser makers *could* implement it, what we would end up with in reality is a proliferation of targets, not a reduction:
IE10 emulating IE7, IE10 emulating IE8, IE10 emulating* IE9, and IE10
IE9 emulating IE7, IE9 emulating IE8, IE9
IE8 emulating IE7, IE8
None of these would render the exact same way. None of them. Then we get Safari 5 pretending to be IE8 emulating IE7 (otherwise it won’t work on http://www.importantsite.com which was coded poorly), and the universe implodes.
*why, you ask, wouldn’t IE10 simply load up IE7’s engine? Well, the engine API has changed (perpetual API freeze is a near-sure way to kill a product), and the underlying system APIs have also changed, and there have been many many security fixes that had to be backported, some of which probably changed compatibility (like, say, turning off DirectX by default in IE7).
pardon the formatting there; I accidentally tried to do a footnote with an asterisk, forgetting that would make it bold instead.
Forgive me if I am missing out, but surely this is overcomplicating the issue to a massive degree.
Browser developers know when they release a new rendering engine. Developers know when they test the layout of a website. Wouldn’t it be far easier, and require far less browser-version-awareness on the parts of both parties, to simply include a metadata tag indicating the last ‘build’ date of the website?
Browser engines can use a timestamp as effectively as a human-readable version number, and incorporate the dates of each rendering-engine change into the browser itself. Developers know (normally as it’s written in blood) when a site is launching. The date at which they test the deployment of a site across all of their target browsers is thus a simple way to embed the required ‘version’ data.
It’s even simpler for WYSIWYG software, which simply needs to attach the date in the mark-up, thus not requiring dialogues or assumptions to work out the target browser market.
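No such mechanism actually exists; as a purely hypothetical sketch of what this commenter’s suggestion could look like (the attribute name here is invented for illustration):

```html
<!-- Hypothetical: declare when the site was last built and tested,
     and let each browser map that date to whichever of its engine
     versions was current then. "build-date" is an invented name. -->
<meta name="build-date" content="2008-01-22" />
```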
I agree with Ben Sekulowicz. If any system like this is adopted—which I don’t think is realistic—it should be based on a build date rather than browser name and version.
Best solution is for browser vendors to make their browsers standards compliant and for developers/website owners to periodically check their sites to see that they render correctly in the latest browsers.
bq. I’m definitely not a fan of the “edge” keyword.
I have a feeling that it’s a consequence of having developers who work with Rails (Gustafson) involved in the committee who came up with this proposal.
I was surprised to hear about this, especially given Chris Wilson’s aversion to supporting two separate VMs for Javascript 1 & 2. Surely promising support within IE for throwing it into “IEx mode” (e.g. “IE8 mode”) will have the same consequences?
One can imagine the nightmare this will become 2 or 3 versions of IE down the track. I’m willing to put money on Yet Another IE-Specific Fix being proposed when the time comes to renege on the promise of indefinite backwards compatibility through version targeting.
It’s true that I’m trusting the IE team to be able to maintain backward compatibility. That may strike some as naïve, but we all trust browser makers to be able to maintain backward compatibility while advancing their standards support already. This might actually make it easier for the IE team.
But note that I say *might*. I know enough about software development (I was a “real” programmer once upon a time) to know that this will be quite an undertaking. I have to hope that they get this right–because the alternative is to have the advancement of standards grind to a halt, bogged down by the inability to fix past mistakes for fear of breaking existing pages.
I still mourn the blow this will deliver to forward-compatible development, but it won’t entirely negate the practice. We won’t be able to use it as a cudgel any more, but practicing forward-compatible development will continue to be the mark of a true professional.
Say Fx5 will be two times faster than Fx4 when rendering multiple backgrounds. But a user will not benefit from this improvement unless the author decides to update the lock-in string from “FF=4” to “FF=5”. Why should vendors (that don’t show such big steps between versions as MS actually does) apply this brake at all, then, if they partly lose the chance to play their trump card—fast, standards-oriented development?
Assume IE10 will be able to run under OS XII Mac/Nintendo. Will it be able to render a page that is locked in to “IE=8”? Switching back to IE8’s rendering engine does not help on a new OS. They would have to port all included engines to the new OS; alternatively, this IE10 would have to ignore all previous lock-in-strings? Or would a previously released engine like IE8 be emulated instead of included? But can an emulated engine match the bugs of the real one?
Updating the lock-in string for a company’s site means new browser test cycles. Therefore, the lock-in string will often be frozen in order to reduce costs, more likely than not. And why should a vendor develop an improved implementation with more CSS3 support if a growing number of industry pages is locked in to “IE=8; FF=4; Safari=4; Opera=10”? Currently, design decisions are made with browser market share in mind. After the lock-in invention, will these decisions be made with the latest lock-in-string stats in mind? If so, will this culminate in an “industry-standard lock-in”?
Both articles have a hypothesis: that the lock-in won’t make assumptions for the future. In contrast, I think the assumption is that the lock-in will not break and that it will help improve the implementation of standards. Both remain to be seen. But you can’t go back once it is widely implemented.
Not only could this make future IEs very bloated but also slower, as they will have to switch rendering to deal with separate sites or pages. The best thing Microsoft could do (in theory) is to drop IE altogether. Freeze it at IE8 and switch to Gecko, WebKit or Kestrel etc. This will be one huge change for many people (as it will likely affect all pages written solely for IE) but not for developers who have always coded for the common browsers in use today. Once that hurdle is over it will be much easier to code for a standard-based browser that isn’t decades old and full of bugs. Of course this won’t happen because it will mean Microsoft giving up control of their browser. But IE is already a patched-up antique. How much longer can they keep working on it before it simply belongs in the scrap yard?
I have got a whole lot of thinking to do on this matter, including reading both articles over again (probably several times). This deserves a well-thought-out response; thanks, Eric, for providing it on this occasion, making for an interesting and balanced ALA.
I won’t hide the fact that my initial thoughts are negative, but this is already something we will have to deal with come IE8. My initial concerns are for those who code to the standards and implement progressive enhancement, those whose sites did not break, but rather improved, when IE upgraded to 7.
I’ll be back, once I’ve thought some more.
bq. Still, we need to be honest about this. We’re not reaching everyone, and probably never will. Some sites will be developed according to what the browser-of-the-moment does, no matter how incorrect that might be in comparison to a specification or even other browsers’ behaviours.
Eric, you lost me here. You seem to have bought the Microsoft sob story about all their most important customers only using clueless web developers, making better standards support business suicide. If this really is the case, then what about web standards?
Do we really want to support a new (proprietary?) syntax that actually enshrines user-agent-specific markup? Seems like an official stamp of approval for hacking it together in IEx, not checking in any other browser, and not even thinking about accessibility etc.
Microsoft is scared of newly compliant browsers (like IE7) breaking their clients’ sites — but surely this generally indicates that the sites work fine in Firefox, Safari etc., and the IE-specific hacks are causing problems for IE7?
I remember you talking at An Event Apart Philly about supporting the yet-to-be-released IE7—you said it should behave like Firefox, so as long as you were careful about a couple of now-broken IE hacks you would be fine (which in my experience was a spot-on prediction). What changed?
This sounds like a good idea to me, but before it becomes canon, I would like to know how other browser vendors feel about it?
Also what would happen if you stated the page should be rendered in IE7 but a user only has IE6? Are we still back where we were before then?
I think the problem most developers and designers have with this is they are more than happy to update their old IE specific code, if IE fully supported standards like other browsers do.
Honestly the only real solution to everything is to have 1 rendering engine, 1 JavaScript interpreter, 1 CSS interpreter for ALL browsers, and just have a different front end for each browser. That is the only true solution to the mess…either that or only have 1 browser for all.
This really does break all that we’ve been fighting for…
We are now at a point where you actually can build a website that renders correctly across all major browsers. We should go forward with this, not backwards. We should try to eliminate conditional comments and hacks altogether.
We need clear standards, and DOCTYPEs for HTML versioning. Nothing more… if some vendors get them wrong, they will fix it, or they’ll lose their clients.
From a marketing point of view this would really, and I mean REALLY, slow down the progress of the web.
Eric, If, today, this META scheme were adopted by all browsers, what would your META tag look like?
How would that page be interpreted and rendered by a not-yet-created browser five years from now? And can we specify different versions on different platforms? (eg. IE5 Mac vs IE6 Windows)?
Perhaps if I saw something concrete it would help me decide if this scheme is something I want to support.
Good question, “Carl”:http://alistapart.com/comments/fromswitchestotargets?page=2#15 . For a client site, the META value would be the browsers which I’d contracted to support. Because yes, clients always list the browsers in which the site has to “work”, and no amount of me explaining the user-agent-agnostic intent of the web is going to change their minds.
For my personal site? Probably @IE=1024@ or @IE=edge@, whichever I end up disliking less.
Why are we, again, having to fix the web for Microsoft. Shouldn’t Microsoft be fixing itself for the standard web? Why are we treating Microsoft as a charity case?
bq. I remember you talking at An Event Apart Philly about supporting the yet-to-be-released IE7—you said it should behave like Firefox, so as long as you were careful about a couple of now-broken IE hacks you would be fine (which in my experience was a spot-on prediction). What changed?
Talking to the IE team about their post-deployment experience. As with you, I had no problems with existing sites when IE7 was released. For the crowd at AEA, I’d expect mostly the same thing to have been true. Unfortunately, that wasn’t the case web-wide. A whole lot of breakage did occur, because the fraction of web developers doing true forward-compatible development is pretty small. This has always been true. It’s always going to be true.
I’m not saying this only happens in web standards, either. It happens in every sphere of development, under every language and environment. Development happens in the moment. That means advancement is either stalled by desire to avoid breakage, or else advances are made at the expense of breakage. Of course, those who suffer breakage do not see what’s happened as an advance, but exactly the opposite.
So that’s what changed.
Thanks, Eric, but more specifically what META would you recommend for sites such as *A List Apart* or *An Event Apart* — sites that want to serve a wide audience? If I were to view the source of this page under the premise above, what would that META tag be?
…then using a *date*, as Ben Sekulowicz and others have suggested, would be far preferable.
As it stands, my standards-compliant website, which IE7 renders incorrectly, would *never* be rendered correctly by any future version of IE (a concept that only makes sense if you assume that any unlabelled page was designed for IE7).
Using a date, however, I’d be able to say “yes, I updated this page in 2010—I’m well aware that IE8 exists: render it properly, please.”
Those who insist on clinging to the past would be more likely to realise they were doing so if they had to include a date in their pages to make them look right. Clients who inspected their designers’ code could say “how come this page says ‘2008’? It’s 2013!”, to which the only sensible response would be “I designed this for browsers that use 2008 technology”.
I suspect (from reading the comments here) that there’d be much less resistance from standards-concerned web developers to an explicit “last-modernised” date than to an explicit browser version; and I can’t see any drawback from Microsoft’s perspective for using dates instead of versions.
Though nothing *should* be necessary, I understand MS’s concerns about backwards compatibility. So, I oppose using versions and advocate using dates.
I understand the argument that any documents that are missing the new metadata should be rendered with IE6 (or is it 7?), but the problem is that when those pages were built, the expectation was that when a newer rendering engine was released, the page would render with the new engine. In other words, these pages have not been “locked in” by the developer, therefore the browser should not assume that they were intended to be “locked in”.
So, how do we handle all the pages that will “break” with the new engine? When the new engine is released, tell the developers/maintainers of those sites to add the meta tag to their pages (or, even easier, have the server add the HTTP header to all outgoing pages). Yeah, they still have to scramble to fix their broken sites, but the scramble is much easier and they can either be done with it, or update their code later when they have the time. Besides, for the most part, the only people who are scrambling are those who were ignoring web standards anyway.
This way, we only get “lock-in” when we explicitly ask for it. That’s much better in my opinion. It also negates the need for the whole “edge” thing.
A few people have asked why we’re treating Microsoft specially, or words to that effect. What I tried very hard to do was evaluate the proposal on its merits, not on its source. I believe I’d have had the same reactions, and come to the same conclusions, had the proposal come from the W3C, the Opera team, the WaSP, or anyone else. That was one of the things I asked myself constantly as I considered the idea: would my thinking here change if the source hadn’t been Microsoft?
Eric, in comment #8 you say “but practicing forward-compatible development will continue to be the mark of a true professional.”
My question here is: why? At the end of the day, how would this professionalism show to the end user? If you can code for the moment and effectively instruct browsers to freeze their behaviour and display to their current interpretation, now and in future, what *is* the advantage of, or even sense in, forward-compatible dev?
Patrick,
The switch only targets IE. All other browsers are unaffected. We cannot order any browser but IE to freeze. And for all practical purposes IE has been frozen for quite a while, until the release of IE7. It’s hardly a new situation.
As far as I can see the advantages of forward compatibility remain the same as always. The versioning switch is a tool for ensuring backward compatibility. It says little about forward compatibility.
Frankly, I don’t understand the argument that the switch will kill forward compatibility. It will only if we let it—and why should we?
There is no cure for backwards compatibility, web sites will break. It’s the only way to move forward.
This means no new doctype, no new meta type, no conditional comments, no more IE-proprietary code. If IE8 is anywhere near standards compliant, let it render in standards mode the same as all other browsers.
I see that IE8 is going to have Conditional Comments?
http://blogs.msdn.com/cwilso/archive/2007/12/19/not-that-you-need-me-to-tell-you-this.aspx#7197568
IE8 is a layout engine without hasLayout. What happens when IE8 in standards mode encounters CSS that relies on it?
How many sites will now break in IE8 or a later version in the future? How many pages does a developer have to change because they used this type of hack in the HTML?
A preceding HTML comment allows the CSS selector *+html to target IE7. Why does this happen? Why can any comment appearing in the HTML be selected by IE7? Could it be that this happens because IE7 is looking for comments already, i.e. conditional comments?
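For reference, a sketch of the commonly cited form of that hack (the selector and rule body here are illustrative). It works because IE7 appears to treat a preceding node, such as a comment, as an element sibling of html, so the adjacent-sibling combinator matches:

```css
/* Matches in IE7 only: IE7 treats a node preceding <html> as an
   element, so "any element immediately followed by html" applies.
   Standards-compliant browsers find nothing before <html> to match. */
* + html #content {
  /* IE7-specific fixes go here; zoom: 1 is the classic
     hasLayout trigger */
  zoom: 1;
}
```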
bq. A whole lot of breakage did occur, because the fraction of web developers doing true forward-compatible development is pretty small. This has always been true. It’s always going to be true.
This fatalistic attitude is sad, because I believe that the efforts of WaSP, ALA etc. have played a massive part in increasing that proportion — you seem to be losing hope, just when we’ve hit the mainstream.
Don’t you think that seeing breakage in a poorly coded site is part of the learning process for people who make websites? And we have beta versions of new browsers to test with, don’t we? (Further, any IE7-caused breakage _has already happened_, so the damage is done. Surely IE7-IE8 won’t be such a drastic change?)
I strongly suspect that Microsoft’s target IE8-breakable site either doesn’t work in any other browser, or uses extensive hacks. Giving that developer an easy way out isn’t helping anyone.
How will components that get plugged into pages handle this header setting? For example, a javascript library that needs to work around browser specific bugs. Will the library be able to find out what compatibility mode the browser is in?
Same issue with a web service that’s including html in a page via ajax. Currently these components can look at the user agent – will the user agent include compatibility mode information?
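As a thought experiment, here is how a script library might branch on the rendering mode rather than on the user-agent string, assuming the browser exposed something like a @documentMode@ property (that property name is an assumption here, not something the proposal guarantees). The function takes the document object as a parameter so it can be exercised outside a browser:

```javascript
// Hypothetical sketch: report which rendering mode a document-like
// object claims to be in. documentMode is an assumed API for
// version-targeted rendering; compatMode ("BackCompat" vs
// "CSS1Compat") is a real, widely implemented property today.
function detectRenderingMode(doc) {
  if (typeof doc.documentMode === 'number') {
    // e.g. 7 when a future IE renders the page in its IE7 mode
    return 'IE' + doc.documentMode;
  }
  if (doc.compatMode === 'BackCompat') {
    return 'quirks'; // no-DOCTYPE rendering
  }
  return 'standards'; // no version information available
}
```

A library could key its bug workarounds off this value, so an Ajax component dropped into a page would at least know which set of quirks to expect.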
Alan Gresley: Considering no other browser in the history of the web has ever been completely standards-compliant, why should we expect IE8 to be?
(I’m not talking abstract corner-cases here; I’ve had to work around bugs in every browser)
Howard Fine above has the best point so far. Why is it that we keep looking for a way to let Microsoft and IE off the hook when they should be trying to keep up with US instead?
Isn’t that the point of the standards movement? Shouldn’t we be encouraging vendors to create browsers that adhere to the spec more accurately and cause sites that were built incorrectly to break? (By “incorrectly” I mean built with proprietary hacks and browser-specific style sheets.)
Aren’t we encouraging an internet that looks more and more like a collection of bad MySpace pages when we allow people to use WYSIWYG builders that lock down their code and never analyze it again?
Nothing angers me more than Conditional Comments when I see page code. If you can’t build it right, why bother?
Of course, after saying all that, admittedly, developers can’t even seem to agree on a CSS 2 and 2.1 spec to begin with. The CSS2 recommendation was back in May 1998, and CSS 2.1 didn’t arrive until July 2007? That’s worse than IE6 to IE7… yikes!
bq. Besides which, we’ve written enough scripts and hacks to make our pages adjust to browsers. Isn’t it about time browsers started adjusting to our pages?
Adjusting based on what criteria? How are browsers supposed to know what we “meant” by a piece of HTML code? Answer: check the standards. Authors and browsers must both follow the standard, then there is no “adjusting” to be done.
The proposed Meta element switch is just like tags in Microsoft’s OOXML such as “autoSpaceLikeWord95” or “useWord97LineBreakRules”. These are meaningless without the source code.
It would be terrible to fragment the web again into pages targeted via their Meta elements to various browsers. We would be back to browsers having to emulate the bugs of other browsers.
I don’t understand why the IE team has this whole “Don’t break the Web” mantra. Legacy support to this extent doesn’t seem to be followed by any other Microsoft team, or sometimes even by themselves. My belief is quite simple: *Let it Break*!
Looking at IE7 as a product, it only runs on the current and previous editions (Home editions) of Microsoft’s own operating system. People have complained about the lack of support for old versions of Windows, but it’s a reality of development.
I think it’s far more important that we focus on progress than that we ensure archaic websites continue to function in their former glory. I’m not saying that vendors shouldn’t try to support older websites, but I think that support should be passive rather than requiring active participation from web developers.
Let’s take this time that’s being _wasted_ on excessive legacy support and instead invest it in better bug testing and better development tools, so that there are fewer legacy bugs to worry about.
I really like the idea of version targeting except for two main issues, which have been mentioned before, but I’d like to boil it down to the crux of it:
# How would browser vendors provide support for previous versions and cross-vendor versions? Would IE9 be distributed with a complete set of APIs for IE7, IE8, and IE9? How would Firefox render a page targeted only for IE7, with no Firefox version specified? If complete sets of APIs are shipped, then every time a page is hit that requires another version’s API, all of its libraries and symbols would have to be loaded into memory (or everything loaded at startup); that, in my mind, adds up to three words: *bloat*, *bloat*, *bloat*!
# Assuming that all my concerns above are addressed and there is a great solution for them, then next thing is would the vendors be able to promote or push the latest versions of browsers to their users in a timely manner? I run a site that attracts just the average web user (non-technical), and over 40% of my IE users are still using IE6. If I wanted to target a later version, not being able to get a majority of users up to the latest version would be a great hindrance to forward advancement!
Just my 2¢.
@David Smith
Well, I have some idea that IE8 is a lot better than IE7 in CSS standards compliance. HasLayout is now history! All browsers deviate from the CSS standards to some degree, but not due to working around some “internal data-structure” like hasLayout.
I serve up one lot of CSS: a few hacks or none for IE7, and quite a few for IE6. I have many test cases that render the same in every browser apart from IE, any version. These are not abstract corner-cases.
I’ve seen a few comments here and on “the other article”:http://alistapart.com/articles/beyonddoctype proposing a date-based meta tag vs. a version-based tag. At first blush, that seems like a good idea but I think it ends up breaking down somewhat quickly.
With a date, you only get to say “I’ve verified this site works for all browsers as of YYYY-MM-DD”, so you’re sort of implying that you’ve tested it on the most current IE, Firefox, Safari, Opera, whatever. Putting aside whether or not it’d be responsible to publish a site without having done so, it doesn’t give the time-strapped developer the option of testing in IE8, updating his meta tag, and leaving FF support at 2 even though version 3 was current as of that date (or, for that matter, publishing an update to take advantage of FF3, but leaving the IE version at 7 for now till browser adoption increases). A date-based meta tag seems like an all or nothing type of deal.
I think it’d be naive to think that other browsers would not be affected if this is incorporated. Rather than promoting standards, it encourages laziness. Which approach is easier: learning to use proper markup, or just setting a version? It certainly seems like a step backward to ensure that what is backward stays that way going forward.
I am thoroughly annoyed by MSFT *yet* again introducing something completely new and *obtrusive to the way _I_ work* to fix _their_ problems.
I’d like to think that when I write good code according to good specifications, my websites will only display better or the same with each new browser version.
What Microsoft is actually saying with this is *”well, uhm, we don’t know if the specifications we support now will be supported in the next versions as well”*, and that’s not really encouraging…
I see a problem with enumerating versions and browsers.
If you have to enumerate the browsers a page is compatible with, then what does it mean when a user uses a non-enumerated browser? If I specify IE=10, does the page work in IE9? How about IE14, when they port it to Linux and a bunch of the old code has to be removed?
I don’t think this is practical.
In addition, are they going to leave the whole IE7 engine in place, or will they emulate it? Probably emulate it, and you can be sure it won’t emulate “right”.
Microsoft is pretty good at maintaining backwards compatibility (to the point of emulating stacks in memory so that Win32 programs that did scary and horrible things behind the OS’s back would work in Win95). However, they have to bite the bullet, and I think this is one of those times. IE6 was very, very broken and not fixed for way too long.
The advantage to making their browser work as best as possible with the standards is that users who aren’t professional will start to write pages that degrade better; not because they understand or care about that, but because the standards help you do that.
The more microsoft promotes proprietary non-standards compliant features the more pain the users will feel. Well, unless they can end up being the only browser; then they’d be the de facto standard.
Ciao!
Eric Meyer (from comments): “I have to hope that they [Microsoft] get this right — because the alternative is to have the advancement of standards grind to a halt, bogged down by the inability to fix past mistakes for fear of breaking existing pages.”
Utter nonsense. Just because MS have painted themselves in to a corner with their Machiavellian behaviour of the past, it’s not going to stop Firefox, Opera, Safari or anyone else pushing forward with quality, standards-compliant software. If anything, this latest debacle from MS will *speed up* the adoption of standards!
This entire proposal of version targeting shifts the responsibility (a.k.a. ‘blame’) away from Microsoft and on to developers and users. It adds another (potentially enormous) layer of complexity purely to make MS look not so bad.
If websites are broken by a new release of IE, we do what we always do – ignore it, live with it or fix it. The same applies for the other browsers … although the degree of breakage in the other browsers is usually minimal because they have made every effort to adhere to W3C standards (Microsoft just said they would 10 years ago, but ‘forgot’ to deliver on the promise).
Did WaSP have an office party recently and did someone slip a little something in to the punch bowl? Just wondering….
I know many of us say we hate using CSS hacks, but I found them to be quite useful. I wish each new browser would incorporate an OFFICIAL hack that is unique to that browser. I also wish we could use official hacks within DOM scripts as well.
Official hacks would have two benefits: 1) We would be able to create new websites that take advantage of new browser features AND still work properly (without the new features) in older browsers; and 2) It would provide a fail-safe to accommodate any found bugs that will undoubtedly be discovered within any new browser.
The version-targeting scheme described in the article does allow us to build sites that take advantage of new browser features while supporting older browsers. However, it does not provide any work-arounds when we discover new bugs within a new browser.
One other benefit of having official hacks is that they would no longer need to be called “hacks”.
We’re talking about code-forking, aren’t we? Raise your hand if you’re a big fan of this practice.
I assume most pragmatic developers familiar with the practice are *not* big fans. Are we really going to ask the developers of IE(x+15) to back-build 15 or more rendering engines into their product, then maintain them all? FF and Op and Webkit too? Is this practical?
Like Eric, I’m naive enough to think that browser developers will eventually get it right — and even *more* naive that spec writers won’t make irksome changes — I code to spec, and hack as minimally as necessary to function in the Real World. Many of you do too. If our hacks are elegant, they simply dissolve, and our sites, well, get progressively enhanced.
The DOCTYPE is a way of telling the world you’re building to a spec, not a browser. Some people, in telling this, lie.
Microsoft: When they lie, it’s not your fault. I’ll let everyone else tell you what *is* your fault, but suffice to say, your DOCTYPE switching doesn’t break the web… it’s broken by those of us who’ve duct-taped our code with esoteric browser dependencies and hacks. Maybe you inspired our bad behavior, but you didn’t make us lie. We can read specs. We can, and have, found ingenious ways to deal with bad browser behavior while respecting the specs our DOCTYPEs are selling.
Hacks (code-based and carbon-based) are revealed when you change code. I’m okay with that. Code-fork for different DOCTYPEs when the time comes, sure, but not for your own rendering and behavioral engines. We’ll adapt. We always do.
I like the CSS hacks idea a lot. It would degrade gracefully and would be nice to have:
@internet-explorer v7 * img { border: blah blah; }@
or
@firefox * .button { -moz-specific-element: }@
Seems much nicer than the meta hack.
The selector could be above the document:
….
Ciao!
This seems like a good way to keep folks from ever developing for the newest version of IE.
What happens if I lock my site to a version of IE that is higher than what a user has installed? Then I have a big mess, and I haven’t gained anything by being able to specify browser version. To get around this, I would have to specify “edge” and keep developing the same way I do now. But those other developers with the non-professional coding practices will just choose the lowest common denominator browser version and never update it. It will be an IE 6 web forever.
I have to say that I would disagree with Mr. Meyer.
The purpose of the doctype, as I see it, is to converge the morass of browser versions and standards implementations and make them manageable. A bit like herding cats but herding them all to a point on the horizon where standards compliance for web pages and browsers is the norm. Having a quirks mode and a standards mode, duplicitous as it is, seems far saner than continually diverging.
At some point, maybe 20 years into the future, when FF400 and IE9 hit the streets, browsers and OS’s will be forced to maintain absurd levels of backwards compatibility trying to accommodate every unclosed element ever written. Why should millions of poorly written web pages hold an entire industry hostage? I don’t expect my car to run forever without repairs, so why should I expect a web page to?
Use case: I author a page today for IE7 and FF2.x. I specify these as my targeted browsers in the meta element. Five years from now, IE8 has properly implemented a few more CSS rules, and FF300 has again broken new ground. I want to use the new CSS rules and features. Do I re-author the page and update my meta element? How is this any better than where we are today?
Use Case: I have thousands of pages authored to IE7. IE9 just came out. Now I have to trust that MS won’t drop support for IE7. How long can we reasonably expect browser manufacturers to continue to roll up support for every past browser version into their new releases? And what does this mean for browser performance?
Right now I am having trouble seeing how authoring pages to a specific browser version is a step in the right direction.
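For reference, the mechanism being debated is a meta element (or an equivalent HTTP header); the version tokens below follow the proposal, with values chosen for illustration:

```html
<!-- Lock this page to the IE8 rendering engine, even in future IEs -->
<meta http-equiv="X-UA-Compatible" content="IE=8" />

<!-- Or: always use the newest engine the browser has -->
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
```

The use cases above turn on exactly this line: every future re-targeting means editing it on every page, or sending the equivalent header site-wide instead.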
A web developer should not optimize his works for a specific version of a web browser, but make it compliant to a standard — after all, that’s what they are made for.
If the IE3 development team realized shortly after the release that there was a bug in their width/height calculation, why didn’t they fix it as soon as they could? At that time, nobody was writing web sites adapted to IE3 yet. There is just no need to stay backwards-compatible with bugs. If a browser vendor knows about a bug, why can’t it simply write a knowledge base entry: “folks, there is a bug, it will be solved soon; until then, just wait or use some workaround”? With that approach, everybody realizes that they will have to remove their workaround and replace it with the real code as soon as the bug is fixed. This, of course, would require faster bug fixes inside the rendering engine; waiting for the next major release is not the way to go. But is that really so hard?
I would support a meta tag giving information about which HTML version a page was coded for, with every browser vendor fixing bugs until it arrives at the point where it completely implements the standard. If you rely on non-standard behavior of a browser, the trouble will not end soon.
Rendering-engine targeting already exists: it’s called DOCTYPE! That method alone lets us tell the browser how a page is supposed to render. Why should we replace standards with multiple, branded “standards”?
The place where I can see “version targeting” being useful is in hacks, much like a standardized form of what conditional comments do today. That way developers can write code the way it’s “supposed” to be as per the standards, and add extra code to fix the rendering in whatever browser(s) they choose to fix.
However, software being the way it is, browser bugs are always going to happen, and content developers will always have to work around the limitations of the browsers of their times. The problem then, is which browser should developers work on? The theoretical, bug-free, standards-compliant browser in their heads? Or the real, bug-ridden but computer-powered browsers in their PCs?
How utterly depressing! And Eric Meyer is the one who used to prevent his pages being viewed by non-standards compliant browsers (e.g. my old IE5.1 Mac). Shame!
To me this sounds like “vlek op vlek” (phrase from an old Dutch detergent commercial meaning “stain upon stain”). MS again invents a new hack to work around their previous ones. And for the looks of it this won’t be their last.
There are several problems with this approach, most of them already mentioned here:
* Progress will be stopped or slowed down, since developers cannot specify IE8 as the engine to be used while a considerable number of users are still on IE7 or even IE6. And since ‘traditional’ progressive enhancement won’t work anymore, websites will likely not take advantage of features in later browser versions. This will hurt other (non-IE) browsers as well, especially since IE has such a large market share. Unless perhaps…
* The ‘new’ progressive enhancement would be something like building enhanced versions of the site for enhanced target browsers.
Sounds kind of old school to me and it sure is not what progressive enhancement is really all about.
* Browsers will be stuck to all their previous render engines. This will probably make them big (and slow), especially when new engines will introduce new APIs.
* This will prevent browsers that use this hack from ever being ported to a previously unsupported OS, unless the old engines are ported to that OS as well, or old-version support relies on emulation, which will never be sufficient, as it’s mostly bugs and quirks that make each version so specific.
I can see the points of both sides of the argument, but it seems to me that if we want to maintain the idea of progressive enhancement, we should be able to either not include the meta tag at all, or include the meta tag with “IE=edge” (and perhaps “OtherUA=edge/10000”?). I guess it depends on whether browser makers will assume that the absence of the header means that it should stick to a particular version, or it should fall back to the DOCTYPE switch (which isn’t much different from what we are describing here).
I really don’t see how this is going to change the workload of diligent web developers. As new browser versions are created, we’re going to want to regression-test our sites anyways, whether to ensure that updated spec compliance isn’t buggy, or to ensure that outdated “features” are still supported as before.
What this lock would permit, however, is a longer transition period for web developers to deal with bugs in these browsers. If the current version of IE is 8 and I am relying on its implementation of a certain feature, I could lock all 40 of the sites I support to that version. I can start working on bug fixes with the final release candidate of version 9, but if I can’t fix everything before IE9 comes out, I can _hope_ that Microsoft didn’t break the browser fork for their most recent public version. Once I fix the bugs, I can update the version number. The development timeline is now more under my control than Microsoft’s.
Either way, however, web developers who aren’t keeping track of browser developers are likely to be left in the lurch with or without this tag.
@Michael Landis:
Correct me if I’m wrong, but the way it is proposed now, you cannot really update the version number as soon as that browser comes out. All users that have not yet updated their browser will not have the specified render engine.
I believe you’re right. It seems to me that if a user hasn’t upgraded their browser, but the browser baseline has increased (such as an IE 8 user who hasn’t upgraded to IE 9), then they will be using IE 8 while the meta tag may eventually read “IE=9”.
We deal with multiple browser versions today anyways, with graceful degradation and progressive enhancement. I don’t think we will ever really get away from not supporting older browsers. This just gives us a way to gain more time to test new browsers.
bq. And Eric Meyer is the one who used to prevent his pages being viewed by non-standards compliant browsers (e.g. my old IE5.1 Mac).
I think you must have me confused with someone else.
I agree with Landis, that this may give us all more time to test newer browsers. It’ll be up to us when the switch occurs.
I remember when IE7’s pre-release was available and we were all testing our Web sites with it. Some were trying to figure out how to uninstall it afterwards to return to IE6! All that’s history now.
We weren’t sure whether the final release would provide the same results, but we were preparing anyway. Anything to avoid the shame of being the first to reveal some new browser bug, live on production…
I think this form of version targeting is not a bad idea considering the alternatives, which is either no fixes to rendering bugs in IE for the foreseeable future, or using some other mechanism such as conditional comments, XML processing instructions, or overloading the MIME type somehow, all of which would have been awful hacks.
But I don’t think this should be used by the other browser vendors unless they actually need it (and it looks like they feel like they don’t, so far). And I really hope version targeting (or the lack thereof) only affects fixes to bugs in IE that are actually problematic with respect to backwards compatibility, and not improvements and other kinds of bug fixes. Which is to say, I hope this *isn’t* going to replace virtualization for cross-browser testing—you won’t be running IE6 just by version targeting IE6 (consider things like security changes or text rendering improvements).
If that is the case, you won’t actually need version targeting for every page or site, but only for those that actually trigger some rendering bug in IE7 (okay, those are still quite a few, unfortunately). And hopefully, IE8 will be on the level of the other browsers with respect to standards support (die, @hasLayout@, die), and then there’ll not be much need for version targeting beyond adding the “IE=8” boilerplate.
More in a “post on my own blog”:http://www.cmlenz.net/archives/2008/01/rendering-mode-switching-reloaded …
I have trouble following your views on how the proposed version targeting is going to stimulate browser vendors to fix bugs sooner.
You give the example of a bug in IE8 being fixed in IE9 without fear of breaking a site from the IE8 era. But if version targeting is used, rendering will stay correct, so there’s no incentive whatsoever to fix anything. Clients and designers not bothered with standards will see no further use for change: it works fine, so why bother fixing anything, and why even bother with those nasty standards?
In turn, I’m afraid Microsoft will like that attitude too. It gives them the opportunity to go back to the old days, when IE6 barely had any competition, little or no improvement had to be made, and few resources had to be put into its further development.
If they can make sure everybody uses version targeting in future, where’s the need for fixing bugs in future versions?
We all seem to forget that IE7, and soon IE8, came to be due to market share, not due to concern about advancing the browser or fixing bugs. If it weren’t for the rise of FF, we would still be stuck with IE6 today.
So I’m having nightmares about just the opposite happening: advances will be slowed down.
please tell me I’m wrong so I’ll be able to sleep a bit better… 😉
Peter-Paul Koch wrote: _”The switch only targets IE. All other browsers are unaffected.”_
I was under the impression that Microsoft want all other browser makers to adopt this policy too. So you’d have an Opera, Firefox or Safari with rendering engines from the past included to deal with old pages.
Douglas Tondro wrote: _”Nothing angers me more than Conditional Comments when I see page code. If you can’t build it right, why bother?”_
I built a site 100% to standards (XHTML Strict etc). It displayed perfectly in IE7, Opera, Firefox and Safari. But when I tested it in IE6, the layout was broken. So I _had_ to use Conditional Comments to tweak it for IE6. I believe they are an excellent method of applying code to different versions of IE.
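For anyone who hasn’t used them, a typical conditional comment looks like this (file names are illustrative); browsers other than IE see only an ordinary comment, while IE6 and older also parse the extra stylesheet:

```html
<!-- Served to every browser -->
<link rel="stylesheet" href="main.css" />

<!-- Parsed only by IE version 6 and below -->
<!--[if lte IE 6]>
<link rel="stylesheet" href="ie6-fixes.css" />
<![endif]-->
```

This keeps the IE6 patches quarantined in one file instead of scattering hacks through the main stylesheet.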
I had some thoughts and “posted them on my blog”:http://www.unintentionallyblank.co.uk/2008/01/23/version-targeting-for-ie8-developer-wars-my-thoughts/
What do you think?
Your “Cascading Style Sheets Programmer’s Reference” book has served as my bible for many years. I have (had?) a lot of respect for you. You’re one of the last people I would have expected to buy into this Microsoft BS.
Sorry, but I have to call it what it is…
What’s got into WaSP and ALA?!?! This discussion has nothing to do with Standards and everything to do with *Microsoft* market share… It’s legitimate of them, on a business level, to worry about it and make sure old sites can work with IE8.
But how can you guys agree that standards compliance should NOT be the… huh… STANDARD? Now, thanks to Microsoft, standards compliance becomes “edge” development? And according to Gustafson’s article, it’s “strongly discouraged”… What?!
Did Microsoft send booze and exotic dancers at the meetings, and made you sign this article while you were under the influence?
What is going on???
I’m not real sure about this quite yet. What’s going to happen to the old sites we have? Will it default to “edge”?? That could be pretty bad.
I’m unclear on why Microsoft feels like they need to re-invent the wheel when they release a browser (using Trident and now a new rendering engine). Why don’t they just use Gecko or WebKit to really create some consistency across browsers? Maybe they have a financial obligation to fight open source with every fiber of their being… I don’t know. Seems a bit ridiculous to me, but the almighty dollar tends to rule.
I have a question.
What are the differences between IE7 and IE8 that make this switch necessary?
I don’t know about the web as a whole, but the business customers (enterprise) I deal with all use either Firefox (minority) or IE6 (majority). IE7 is only a teeny tiny part of the mix. For the most part I don’t do any hacks for IE7; I let it render like Firefox, Opera, and Safari.
If the changes between IE7 and IE8 aren’t that big, then why bother with this? Microsoft already made the huge jump from IE6 to IE7.
Ciao!
I wrote:
And Eric Meyer is the one who used to prevent his pages being viewed by non-standards compliant browsers (e.g. my old IE5.1 Mac).
And he replied
I think you must have me confused with someone else.
Not if you are the guy who wrote the O’Reilly book on CSS. Your website has changed, but didn’t it used to have bleeding edge css demos which used css hacks to detect browsers that couldn’t render them, such as Mac IE5.1 and set display to none?
I don’t know about everyone else, but I’m inclined to give Eric Meyer the benefit of the doubt on many web topics. I’m skeptical about the whole thing, but I’d like some more information before I pass judgment.
Besides that, if IE8 is anything like IE7 this will be irrelevant since we’ll all be testing in IE6, IE7 and IE8 until “Back to the Future II” is set in present day (2012).
@David Leader: Bleeding edge CSS demos are the sort that are likely to break or fall apart horribly in older browsers such as IE Mac 5. Browser sniffing was probably introduced to stop a flood of emails to say that certain demos don’t work in certain browsers.
@Tim Wright: the default is set up such that all websites without a DOCTYPE get quirks mode; if you have a DOCTYPE you get IE7 standards mode, and if you have the DOCTYPE and the meta tag you get to choose your target.
I have had some more thoughts on this though, since finding out that the HTML5 DOCTYPE _won’t_ cause a default to IE7, but rather keeps development on the edge. “This discovery made me much happier with the whole situation”:http://www.unintentionallyblank.co.uk/2008/01/24/version-targeting-html5-and-the-other-browsers/ . Now I believe that the whole thing is just a necessary evil for Microsoft, and it needn’t affect us developers.
“The way I see it”:http://meyerweb.com/eric/thoughts/2008/01/23/version-two/?caught_as=moderation#comment-304820
All software has defects. To think that because you use this meta tag today that you are safe “now and forever” just gives you a false sense of security, nothing else. There is no guarantee that when IE9 comes out that it won’t break the IE8 rendering in a fundamental way. If you believe that’s the case, I have a bridge to sell you …
Given that software defects *just are*, why should we have to work around defects in IE8’s rendering, then again in IE9’s IE8 rendering engine, ad infinitum? A defect is a defect; they should be *just fixed*, as Firefox has done and is continuing to do in Firefox 3. For heaven’s sake, we haven’t needed this for Firefox/Opera/Safari, so why, *WHY*, do we need it now?
You go ahead and use this tag, but when your site/application doesn’t work in IE12 because you decided you didn’t need to make your site/application compliant with the spec because being compliant with IE8 was *good enough* and IE12 broke it anyway, I’ll be happy that my sites/applications didn’t need this tag because I stuck to the standards.
bq. This is madness, pure and simple. Even if browser makers could implement it, what we would end up with in reality is a proliferation of targets, not a reduction:
IE10 emulating IE7, IE10 emulating IE8, IE10 emulating IE9; IE9 emulating IE7, IE9 emulating IE8; IE8 emulating IE7 […]
Add HTML versions and DOCTYPE switching to that equation, and you will end up with an astronomic number of combinations.
*Eric* checks himself and rationalizes this proposal with the *_dangerous assumption_* that the push forward in *_standards development would continue even after we have this oh-so-easy-to-abuse ability_* to freeze pages in time *_by doing absolutely nothing_* to trigger it.
Sorry, Eric. _Better have a doctor check you out_. When you begin to *_ignore your instincts_*, it’s a sign that the bone in your head that tells you “it’s stupid to *_step in front of an oncoming train_*” is obviously *_broken_*.
Look, we have these *_gut feelings_* for a reason. We feel *_relieved and satisfied_* when we are *_sure_* that the situation which is tingling our “spidey-sense” has been thoroughly analyzed and we are *_confident of our conclusions_*.
If you’re still getting an *_icky feeling_* inside, it’s because you’re *_not really fully convinced_* in what you’re telling yourself. That should be your cue to *_delve deeper_*.
So, you’re thinking, *what is the worst that could happen if they do this*?
“A Glimpse of the Near Future”:http://www.webstandards.org/2008/01/22/microsofts-version-targeting-proposal/#comment-59762
*THIS IS AN ATTEMPT TO UNDERMINE OUR STANDARDS AND SILENCE OUR DISSENT ONCE AND FOR ALL*. *DO NOT LET THIS PROPOSAL GO UNCHALLENGED*. *DO NOT ACCEPT THIS BLEAK FUTURE FOR OUR WEB*.
Our feedback was *_not_* solicited. The community was *_not_* consulted. We were *_not_* involved. This was done *_covertly_* with the participation of _self-proclaimed gurus_ and _individuals_ *NOT REPRESENTING* the web development community at large *NOR* the official position of the Web Standards Project (WaSP). WaSP has *_made public_* this *FACT*:
# “WaSP Statement and Discussion”:http://www.webstandards.org/2008/01/22/microsofts-version-targeting-proposal/
# “Disclaimer of Responsibility by Andy Clarke – Co-lead of WaSP”:http://annevankesteren.nl/2008/01/ie-lock-in#comment-6376
*DO YOUR PART TO PROTECT THE OPEN WEB*. *BOYCOTT THIS ATTACK ON CHOICE, INTEROPERABILITY, AND STANDARDS*.
Thank *_you_* for *_your generous devotion of your valuable time_*.
*_We owe it to ourselves:_*
“Alternatives NOW”:http://blogs.msdn.com/ie/archive/2008/01/21/compatibility-and-ie8.aspx#7228772
Give them something to do, other than coding the original proposal!
My first thought when I read these two articles on ALA was: have they been bought by Microsoft?!
Seriously. Pages are coded to standards, the browsers display the standards as correctly as they can [be bothered to]. This way the standard, the one specified in the doctype, is the common point about which the markup and renderer combine.
There seems to be a bit too much talk of people thinking that the IE6 rendering engine was perfect (who on earth thinks that?) and about catering for the poor sops who can’t run a standards compliance check.
This may make things easier for MS (until, as others say, in a couple of versions’ time they go off in some other half-assed direction), but I can’t see who else it helps.
What particularly worries me about these two articles is their common assent – surely the authors would like to change something about the method of implementation.
Why can’t MS apps just look at the doctype and choose the renderer they think might manage, Webkit maybe!
This proposed solution is just as bad as, if not worse than, “made for IE4, best viewed at 640px”.
Please Google these three little words. Intention is everything. And Microsoft’s intentions have always been clear; even though the words coming out of Redmond have changed in the past few years, the actions haven’t.
Any experienced developer can tell you version targeting is an illusion and will break soon. The people at Microsoft are not stupid (no matter how much I hate it, IE is a big and complicated application made by some of the finest developers out there), and they know this. They’re not trying to solve the standards issue; they’re selling an agenda. And it’s the same agenda Microsoft has always been selling: the market sets the standard, and we control the market.
Yet again, they are manipulating the market to slowly bend to their ‘standards’, and yet again, many people are falling for it. After all, it’s just one little tag, and it solves so many problems. Conveniently forgetting that those problems were *deliberately* created by the very same people who now offer this ‘solution’.
A compromise would have been a temporary hack that does the same thing, with the total, complete, and irrevocable commitment that this kludge would be removed from the next version of IE. That would give the ‘IE only’ part of the web more than enough time to prevent future breakage.
Instead we now have MS and a cabal of prominent web standards opinion makers selling us the utter fairytale of version targeting, encouraging the entire web to adopt an MS-only ‘standard’ in what seems to be a very well coordinated effort.
I read about a lot of people worrying about IE12 or so. IE7 came 6 years after IE6. How long before 9, 10 and 12?
Besides, I don’t follow the point some commenters are making (not all, though). If lazy developers would use this to render all pages like IE7 forever, how is that different from those same developers using tables and maybe even font tags now?
Maybe MS will drop the IE7 rendering by the time they reach IE10. Well, which one of you ever built a site he or she thought would be online for the coming 20+ years? Which company is still using the same website it was using in ’96? Isn’t it a little naive to think browsers will support more than 3 or 4 versions backward?
By the time MS reaches IE10, the IE7 mode will probably render in Quirks mode or something like that. If you don’t like the new tag, don’t use it. MS will probably insert it anyway. If you do use it, you can be _almost_ certain your site will look the same for the coming 5 years or so, which is already a huge relief for me.
Maybe this is not _the_ solution, but maybe it’s worth considering. No one knows exactly how it will turn out…
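For readers who haven’t seen Gustafson’s article: the switch under discussion is a single meta element (it can also be sent as an HTTP header). As proposed, it looks something like the following; the @edge@ keyword opts a page into the latest engine instead of a frozen one.

```html
<!-- Lock this page's rendering to the IE8 engine, per the proposal -->
<meta http-equiv="X-UA-Compatible" content="IE=8" />

<!-- Or opt in to whatever the newest engine is -->
<meta http-equiv="X-UA-Compatible" content="IE=edge" />
```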
@Richard Reumerman:
“Maybe MS will drop the IE7 rendering by the time they reach IE10. […] Isn’t it a little naive to think browsers will support more than 3 or 4 versions backward?”
After this switch is implemented in the proposed way, the web will break as soon as MS drops the IE7 rendering. Why? Because all the broken pages will still be broken. That’s exactly why a ‘switch’ should be inserted in broken pages to fix them, instead of in standards-compliant pages.
You don’t fix anything by altering all the parts that are not broken, but by altering the broken parts.
The proposed implementation with the default IE7 rendering shows that the argument of ‘not breaking the web’ is not the actual reason, because everyone will understand that MS will not support this IE7 rendering forever.
This is just a short term solution. Not a good one.
The first thought that comes to mind after reading IE8 emulation is:
IE8 emulating IE6 = easy way for developer to make little or no effort in adopting standards.
This seems like a way for Microsoft to take one step back from adopting standards when they should be taking a step forward. unreal…
*_We owe it to ourselves:_*
Alternatives NOW!
See here for a breakdown of the problem and start helping out on an alternative solution:
“IE Blog Comment”:http://blogs.msdn.com/ie/archive/2008/01/21/compatibility-and-ie8.aspx#7254512
*Web developers unite*!
I’m not sure I agree with this… I don’t exactly like it, especially since someone would need to update the string specifically for each new browser release just to test. Not only that, but considering that IE8 passed the Acid2 test at one point and no build of IE before it had, I would wager that multiple rendering engines would need building.
Along with that, I’m thinking about 5 years from now, when CSS3 is (hopefully) complete and the CSS 2.1 support implemented in something like Firefox 2.x is no longer quite correct. If that is the case, this ridiculous “feature” makes any future browser release that supports it bloated, and it wastes a developer’s time trying to make everything compatible while still supporting new things.
In addition, consider the quirks mode vs standards mode box models. Add this version targeting mechanism into the mix, and you’ve got a gigantic codebase, not to mention the fact that this whole idea just makes things more difficult for aspiring developers who want to create a browser.
That’s my thought on this whole thing. I thought the same thing Meyer did when I finished Gustafson’s article: this helps our current situation, but what about the future? Backward compatibility is ensured, but that means we can stay with those frozen browser versions until something like XHTML 2.0 comes along, and updating to it causes our sites to completely break. If we wanted that to happen, we would have ignored the W3C’s effort to standardize the Web.
Sometimes I wonder how people come up with such crazy things, expecting everybody to say, “That’s a wonderful idea,” without thinking ahead first. I’m not sure about anybody else, but I was always taught to look before I leap.
I’ll admit that I don’t have any answers, but I do have a few comments. Until last month, only one of my computer clients had a screen size larger than 1024×768. Now three of them do. Most of my clients over the age of 60 (retired and spending money) set their screen resolution to 800×600 to get that large-type, easy-to-read effect.
I have the current version of Firefox (2.0.0.11) on five different computers, and web pages look different on all of them even though the screens are all 1024×768. Ubuntu wants to be completely Open Source, so of course it doesn’t have Microsoft fonts installed, which is the first noticeable difference. Firefox may be the most standards-compliant browser, but I seem to be able to break it without any trouble: I installed a font it didn’t like, and it broke the spacing on some HTML entities until I removed the ‘offending’ font.
Opera has its own problems.
Yes, I design some basic web pages. The first thing I tell my clients is that their pages will look different on every computer and in every browser. Sometimes I have to show them on my computers before they will believe it. They almost never look at anyone else’s computer or use a second browser. I have every browser I can get on Windows, Linux, and Mac. They send me pages in Microsoft Word, using fonts I’ve never heard of, and want me to duplicate them on their new web site. They tell me “It works on my computer,” and I’m sure it does. Of course, sometimes they want me to do pages that resemble random video noise and wonder why they can’t read the text in the middle. Maybe the idea for ‘captchas’ came from one of those pages.
I have concluded that you never get ‘clean’ solutions to real problems. The best you can hope for is the ‘least dirty’. While I’m glad that people are looking out for the future of the web, I kinda feel like you’re in an ivory tower and I’m on the ground. I hope that the future you come up with doesn’t disenfranchise the people that I know.
I reread a part of the article by Gustafson and came to this conclusion.
This solution is not offered to us programmers who follow standards, *it is offered to those who do not follow the standards*, or at least not strictly.
Because of this it is obvious a lot of readers here don’t like it, because it is of no use to them and just helps those who are seen as _lazy programmers_. By the purists anyway.
_Please note_, this is no offence to either kind of programmer but just an observation. I think anyone who doesn’t like this tag shouldn’t use it, it is offered to help someone else than you.
Try to think of yourself as a major producer of a web browser like MS. You have to make things work one way or the other, so why not use the standards? They are designed for it.
However you do have a lot of users who are kind of ‘depending’ on earlier mistakes you made for their websites to work. Since you are a company you want to keep making money out of them, so you offer them a solution like this.
Certain people don’t need the solution, they don’t like it so they criticize it and won’t use it. You ignore them because this thing wasn’t meant for them anyway.
And that’s what’s going to happen, I guess…
Say this is a backward step, or you want to use “edge”, or whatever… what happens if you just ignore it completely, or use it to lock only compatible browsers while you bug-fix for the latest version? If we ignore the tag, is IE going to choke on us?
I’m in favor of a meta-tag like
@<meta name="build_date" content="28-1-2008" />@
If documents don’t have this, it defaults to the date this rule gets introduced (much like PHP’s time() counting from 1970).
So we can build to (present-day) specs, put the date in the meta-tags and not worry about the site breaking when a new browser releases.
@Sander Aarts
“Browsers will be stuck to all their previous render engines. This will probably make them big (and slow), especially when new engines will introduce new APIs.”
Browser vendors can make up their own minds about whether to offer rendering for 3 previous versions or 20. But if they support at least 2, most websites won’t break, which is what this whole discussion started about.
If IE8 rendered IE7 and IE6 pages as they were intended, it would save a lot of time.
@Christian Holtje
“@internet-explorer v7 img { border: blah blah; }@”
Bad idea. When IE8 does something different, you would have to fix it. You might no longer have FTP access to that domain, or the time to fix all that.
@David Cocuzzi
“Use case: I author a page today for IE7 and FF2.x. I specify these as my targeted browsers in the meta element. Five years from now, IE8 has properly implemented a few more CSS rules, and FF300 has again broken new ground. I want to use the new CSS rules and features. Do I re-author the page and update my meta element? How is this any better than where we are today?”
The difference: CHOICE!
Developers have the choice to update their sites to the latest browsers and CSS, or to let the oldies render as they were intended, thus extending their lives a few years until most users have later browsers installed.
And if you really want or need to update, your deadline for updating old sites becomes more flexible.
@Sander Aarts
“You don’t fix anything by altering all the parts that are not broken, but by altering the broken parts.”
But the old sites were not broken; they were made broken by non-backward-compatible new browser versions.
“Altering”? No need to alter anything; the switch is just an option to use in sites to be built in the future.
@Ben Sekulowicz
“simply include a metadata tag indicating the last “˜build’ date of the website”
totally agree with this.
@<meta name="build_date" content="28-1-2008" />@
@Carsten E:
“Browser-vendors can make up their own mind if they want to offer rendering for 3 previous versions or 20. But if they do 2 at least most websites won’t break, which was what this whole discussion started about.”
Using this switch you can specify a version number, which means that if IE12 encounters a version switch that specifies IE10, it has to render as IE10; otherwise there would be no reason to specify a version at all.
“No need to alter, just optional to use in sites to be build in the future.”
It won’t be optional in the end, because once it is implemented, IE7 will be the default mode forever! Not really an option in the long run. That’s the whole problem with this switch: the default rendering mode will not be the latest version but an old one, and that cannot be reset later on, other than by introducing yet another switch.
Therefore this switch is not a solution to a problem, but only a temporary cover-up.
From the original Beyond Doctype article:
“In an ideal world, of course, all specifications would be perfect from the get-go, and their implementation in user agents would be immediate and flawless. In a slightly more down-to-earth version of an ideal world, browser vendors would immediately integrate regularly updated standards into new user agents—and users would have instant access to the latest version of those browsers without having to lift a finger.”
When I read that I thought “Doesn’t my antivirus software do this every day?” Each day at 2 AM it downloads an updated database of virus information and automatically installs it. Maybe the thing to do for Microsoft (and others) is to create a “standards database” that can be updated every so often automatically by the application.
In fact this could easily function as an improvement of or extension to the Firefox/IE/Opera update mechanism that alerts me that a new version is available. Just tell me that an updated “DTD entry” is available and tell me to install it. Or even better, have the browser automatically pull it down and install it.
(yes, this is pretty much a copy of another comment. The more times I say it the better chance someone will see it.)
I’ll second Keri in post 31. Let the sites break and the users upgrade their browsers.
I remember that I was expected to upgrade to Netscape 4 when a lot of sites started using frames. We’ve all learned that frames are obnoxious and not a good idea in general, but the point was in order to use the new feature I had to upgrade. I think the same thing applies now just as much as it did when all of the browsers started supporting frames. If you want to take advantage of new browser features I say you must upgrade. I think standardized rendering is a feature worth upgrading for.
Of course upgrading will only work if the “new” browsers actually adhere to a common and standard specification.
I’d also like to take this opportunity to thank Microsoft for making my job as a web developer as difficult as possible, with the likelihood of becoming impossible in the near future.
I can’t imagine that nested engines would lead to anything close to a stable product. Even assuming that somehow developers get the standard browser implementation of IEn running as IE7 to work as expected, there are still the other myriad uses of the IE engine to worry endlessly about.
Given that this type of support would almost certainly need additional overhead, how would a move like this affect the development of the browser for mobile devices, embedded systems and integrated browsers (like Help sections)? It is surely hard enough to get a browser to function under these conditions, let alone once you add increasing levels of complexity and a fixed hardware platform.
How would you suggest even approaching things like embedded systems, where you know conclusively that a browser will never change versions? ‘Lite’ installs with single-version support?
Browser switching is probably MS’s response to losing market share to more standards-compliant browsers like Mozilla/Firefox.
The hooks:
# For developers: throw a spanner into the pot so developers need to spend even more time coding to IE’s peculiarities. They might be kept so busy they won’t have time to develop for other browsers.
# For the public: why ever use anything other than IE? It becomes the only browser capable of reaching backward and forward in time.
Blah. More posturing. More MS complications. More reason for web standards development to keep moving forward.
The most telling aspect of this article is that it reads more like an apology than an endorsement. It’s almost as if Eric Meyer threw up his hands in exasperation and said, “Sorry guys, I know there are a lot of problems with the idea, but this is the best we can do given the circumstances.”
As much as I respect Mr. Meyer, I’m going to have to disagree. We can and should expect better.
It’d be much more useful if Microsoft and other browser makers made better resources available to web developers. How about the latest versions of the code, e.g. http://nightly.webkit.org/ ?
How about releasing development versions of OLD browsers that would enable developers to run multiple versions of FF, IE, or whatever on the same machine for testing purposes?
How about some simplified matrices of what features have been updated, how rendering has changed from browser to browser, and lists of developer-friendly info on fixing rendering issues?
Maybe Microsoft could release a developer toolbar that didn’t suck, or even a JavaScript debugger (or at least a good logger) that doesn’t require Visual Studio.
That’s the kind of stuff that, as a developer, I would find genuinely helpful. There is plenty of room for improvement in other areas without going down this road.
The browser wars may be back if this proposal goes ahead. If so, do battle with words. Let Microsoft know we are NOT happy with this idea, which may well stagnate large parts of the web, locked into IE-only sites designed for out-of-date browser versions.
I’ve made a range of “t-shirt designs”:http://www.flickr.com/photos/christopherhester/sets/72157603847933333/ suitable for the oncoming battle. Let’s all join forces and combat this growing threat. Don’t let Microsoft control the web and hold back standards!
There are a number of good points made in the responses to this post. I have a few thoughts of my own:
My own recommendation would be to use HTML 5 as a breakthrough point. At this point in time, force the doctype to be strictly HTML 5. Everything made before this time would be HTML 4 or earlier and would have quirks. Have no “quirks” mode for HTML 5. If the markup is wrong, let the browser show a failure.
I’d also recommend the doctype for HTML 5 include the numeral “5”. Should a future standard break HTML 5 rules, browsers would know to use HTML 5. If you write a faulty program, it’s going to crash. If you don’t write a web page correctly, it shouldn’t display.
I see no reason to use the meta tag to target specific browsers. Many users upgrade to the latest browsers. Some are slower to adopt than others.
With the meta tag solution, would subversions be compatible under the major number or separately numbered? IE5 / IE5.01 / IE5.5 anyone?
What I truly think would make a better solution is to have pages tagged with standard numbers. For example:
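The commenter’s example markup appears to have been stripped by the comment form (as happened with the bracket-less build_date tag above). A hedged reconstruction of the idea, using meta names invented purely for illustration, might look like:

```html
<!-- Hypothetical sketch only: these meta names are made up for illustration
     and are not part of any specification. -->
<meta name="html-version" content="4.01" />
<meta name="css-version" content="2.1" />
<meta name="ecmascript-version" content="3" />
```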
That way, the browser would know exactly what versions of each technology were coded into a page. This would also encourage page writers to think about the standards being used. Authoring tools would also become more compliant.