High Accessibility Is Effective Search Engine Optimization

by Andy Hagans

71 Reader Comments

  1. I find it almost amazing that so many good practices in web design come down to using common sense.
    When you try, like me, to create fast-loading, accessible, maintainable and findable web sites, you will find that these different aspects do not cancel each other out; in fact the same solution often applies to all of them, and a solution for one problem benefits the others.
    This article shows how making a site accessible also benefits its findability.
    Building with the standards in mind gives so many benefits that you wonder why not everyone is doing it.
  2. There is way too much BS floating around these days.  SEO shouldn’t be a career - it’s just a practice of effective web developers. I would like to explore web semantics as a means to optimize search engine results, or to make those results more meaningful. What this is all about is getting information into a format that is universally understandable by machines and humans alike - across all platforms.  At work, my case for adopting microformats is beginning to be heard because I’m calling it “SEO”.  Most companies don’t care about things like device independence, handicapped accessibility, or ease-of-development - they only care about money and traffic.  Calling the adoption of web standards or applying semantics to a page “Search Engine Optimization” may well be the excuse we’ve all been looking for :)
  3. I had come to this conclusion myself a while ago. The W3C has accessibility validators which are quite useful for checking parts of your white-hat SEO.
  4. #2 has a great point - if businesses start to think of incorporating accessibility and standards into their sites as something that can help make them money, they’re far more likely to go for it.
  5. It is perceivable that Google indexes accessible sites better than others. This is one more motivation for web designers and developers to change the way they create sites.
  6. Now I’m not one to poo-poo these kinds of articles—this is exactly what I preach to my clients day-in and day-out. But what I’ve yet to see (possibly through my own lack of metrics) is actual, feasible results. The kind of results clients will pay for when I offer them “additional SEO work for their site” (legacy sites only, of course—accessibility and CSS layout should be mandatory on all new stuff). Does anyone in this discussion have something of that nature to offer? I feel it would be a huge boost.
  7. I have been saying much of this to folks where I work; unfortunately I haven’t been able to find the corroborating materials (other articles, etc.) that could make a viable case for it. I had also not thought of tying it directly into SEO. Armed with this, while I may still get a fair amount of guff for hopping on my soapbox to preach accessibility in web design again, I know that some folks, those who really care anyway, will be listening. Thanks for a great read.
  8. A few years ago when I was job-hunting, I was called into an interview specifically because my name had jumped to the top of the Google search results (this is no longer the case, but that’s probably a good thing - I doubt that most people searching for Matt Robinson are looking for me!). The interviewer wanted to know how I’d done it, what the magic secret was that made Google sit up and notice me above all the other Matt Robinsons out there, and the answer was “simple, accessible, semantically marked-up HTML”. It’s not just personal experience, either. I was introduced to one of the black-hat SEO guides (for Google specifically) last month, where various people had measured the effect of certain Google-defeating tricks over time, and the enduring and verified techniques were virtually all compatible with best-practice accessible, semantic site code and content design. Nearly all the others faded in usefulness over time, or even became actively penalised by Googlebot, but good, clear and simple code still has my sites 2nd and 4th on an ego-search at the time of writing.
  9. As a professional SEO interested in web design and accessibility/ergonomics matters, I can testify to the validity of everything said in this article. I even wrote, some time ago, an article very similar to this one (in French): http://s.billard.free.fr/referencement/index.php?2005/01/13/3-article-referencement-et-accessibilite By respecting standards and accessibility guidelines, you are sure to remove all barriers that could block spiders, and to present information in a well-structured and semantically meaningful way.
  10. I am currently redesigning a web site whose sole purpose is to showcase photos that I take. I started using XHTML/CSS/accessibility just to get a feeling for what these technologies have to offer. The main content of this site is photographs. Apart from properly using the alt and longdesc attributes, and including pertinent captions for every photograph, is there anything else I can do to improve the site’s future listing in search engines?
  11. G Guzi, A good way to improve *any* site’s search engine listings is to get links—quality, relevant inbound links. Without them, your site is just an “island” to Google, no matter how accessible it is.
  12. Yes, it’s common sense, yes I and countless others have been preaching this for quite a while to our clients, but now we have an actual well-written article on a respected site with links to other respected articles that we can point our clients to when they are asking how clean, semantic markup can improve their search rankings. Thanks for that!
  13. I’ve always thought this; accessibility and web standards do go hand in hand with SEO. A search engine bot is almost like a person with a disability, because it can’t see or hear. Here’s a link to a website that displays the text as a search engine bot would see it: http://www.seo-browser.com/
  14. I always considered guideline 1.2 to be obsolete. Who nowadays still uses server-side image maps? They’re completely inaccessible to keyboard users (hence the need for 1.2), and rely on convoluted server-side CGI or similar to determine which region was actually activated, based on the X and Y coordinates of the mouse click.
    Now, client-side image maps, fine. There, 1.1 applies: you should provide alternative text for each AREA.
  15. A few more guidelines with direct SEO relevance (a combined markup sketch follows at the end of this comment):
    - 3.5 Use header elements to convey document structure and use them according to specification. Some (most?) search engines will give more weight to headers. Write good header text, which ideally includes some of your keywords (as long as it’s done in a natural way, not just keyword-stuffing everything into an H1).
    - 6.5 Ensure that dynamic content is accessible or provide an alternative presentation or page. If you generate large chunks of important content or navigation via something like JavaScript, search engines won’t be able to see it, index it, or follow any links that it creates.
    - 7.5 Until user agents provide the ability to stop auto-redirect, do not use markup to redirect pages automatically. Instead, configure the server to perform redirects. Spiders may ignore any meta-refresh or JavaScript-based redirection. Redirecting on the server is simply the most transparent method for all.
    - 13.1 Clearly identify the target of each link. Meaning: write good, clear link text (which, as the article mentions, is important). It’s not good enough to have lots of “click here” links.
    - 4.2 Specify the expansion of each abbreviation or acronym in a document where it first occurs.
    - 5.5 Provide summaries for tables. These last two are other legitimate ways to get some more keywords onto your page while helping users understand your content.
    Oh, and related to my previous comment: yes, there is an equivalent of 1.2 for client-side image maps as well: “1.5 Until user agents render text equivalents for client-side image map links, provide redundant text links for each active region of a client-side image map”... However, I’d argue that we’ve now come to a situation where the “until user agents” part is fulfilled by the majority of browsers currently in circulation.
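    To make several of these concrete in one place, here is a minimal HTML sketch of guidelines 3.5, 13.1, 4.2 and 5.5 together (the headings, URL and table content are invented for illustration):
    <!-- 3.5: headings convey document structure and carry keywords naturally -->
    <h1>Greek Island Photography</h1>
    <h2>Photo essay: Santorini at dusk</h2>
    <!-- 13.1: link text that identifies its target; no bare "click here" -->
    <p>See the <a href="/essays/santorini.html">Santorini photo essay</a> for the full series.</p>
    <!-- 4.2: expand an abbreviation or acronym at its first occurrence -->
    <p><abbr title="Search Engine Optimization">SEO</abbr> benefits from all of the above.</p>
    <!-- 5.5: a summary gives non-visual users (and spiders) descriptive text -->
    <table summary="Number of photos published per island, January to June 2005">
      <tr><th>Island</th><th>Photos</th></tr>
      <tr><td>Santorini</td><td>42</td></tr>
    </table>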
  16. Patrick, I agree with you that that guideline is a bit out of date. The principle stands, though: always give text equivalents!
  17. I did a little experiment in setting up a client-side image map and used *alt* text for all areas, but could not get any kind of link listing or alt text to appear when I turned off images in Firefox, Opera and OmniWeb. Using the Fangs screen-reader emulator came up with nothing too. Nor did using a web-based Lynx emulator. Do modern browsers, or at least ADA-capable browsers, allow access to client-side image maps, or does one need to explicitly replicate the links with text-only links?
  18. Great article - I wrote a similar article not too long ago. One problem I often experience with SEO is the matter of who is writing the content. Most of my clients’ sites are CMS-based, so most of the content is produced by the clients. No matter how much I try to educate the clients, they never seem to get it right when it comes to writing SEO text. Even the most basic use of descriptive links is forgotten. I can’t count the times I’ve told my clients not to use “click here” links. Any suggestions on how to make clients more focused on SEO?
  19. ‘Search engines are also “deaf” in reference to audio files. Again, providing textual descriptions of these files allows search engines to better interpret and rank the content’: No, what you need are caption files to index, which is massively more difficult and also exceedingly rare. Also, shouldn’t you have talked about Flash?
  20. I can’t squeeze everything about it into a single article! Besides, I heard Flash was the subject of *your* upcoming ALA article ;-)
  21. Isn’t it in Zeldman’s book where Google is called the richest blind web surfer? Whoever I’m stealing that line from, it’s been pretty useful. Nothing motivates clients better than talking about all that money they aren’t making.
  22. Why isn’t comment #20 marked as an author comment?
  23. The deeper I dig into CSS/XHTML the more SEO issues seem to be melting away. It is nice to see this idea getting more attention.
  24. Repetition forms a statement? Call me daft, but I have the feeling I’ve read all these arguments before in different articles on A List Apart. Obviously it benefits people, looking at the discussion, but I feel this is all old hat, and that A List Apart is moving away from the cutting edge toward the mainstream; which is a shame, in my opinion.
  25. Martijn, it may be repeating arguments, but it’s also reaffirming the idea, and from a different point of view.
  26. bq. I did a little experiment in setting up a client-side image map and used alt text for all areas but could not get any kind of link listing or alt text to appear when I turned off images in Firefox, Opera and OmniWeb. Using Fangs screen-reader emulator came up with nothing too. Nor did using a web-based Lynx emulator.
    There are a couple of points here. In Opera, Ctrl+J brings up a list of all links. I would have thought Firefox would have a similar feature. It is important to provide “redundant” text links for a client-side image map, for a number of reasons:
    - Some users might not realise that the map is clickable.
    - Some users might have images turned off.
    - Some users might have motor trouble that means they have difficulty positioning the cursor accurately.
    - Some users might have impaired vision, and can read text (which they display at a large size) but cannot see images as clearly.
    Generally, the best solution for a client-side image map is to repeat the links underneath as plain text - that way, it is clear to everybody how they can get to the page they want.
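    As a concrete sketch of that last point, here’s a client-side image map with alt text on each area (guideline 1.1) and the redundant text links repeated underneath (guideline 1.5). The regions and URLs are invented for illustration:
    <img src="office-map.gif" alt="Map of our three offices" usemap="#offices">
    <map name="offices">
      <area shape="rect" coords="0,0,80,40" href="/london/" alt="London office">
      <area shape="rect" coords="80,0,160,40" href="/paris/" alt="Paris office">
      <area shape="rect" coords="160,0,240,40" href="/rome/" alt="Rome office">
    </map>
    <!-- 1.5: redundant text links for each active region -->
    <p>Offices: <a href="/london/">London</a> | <a href="/paris/">Paris</a> | <a href="/rome/">Rome</a></p>
    Every user (and spider) gets a plain-text route to the same pages, whatever happens with the image.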
  27. Brad, no quantifiable studies have been performed as far as I know; it’s all empirical evidence.
  28. I’d argue that *all* SEO evidence is empirical. If a client whines, why not show them Google’s guidelines?
  29. bq. There are a couple of points here. In Opera, Ctrl+J brings up a list of all links. I would have thought Firefox would have a similar feature.
    It seems as though Opera doesn’t show the list of links contained in a client-side image map either, because I got an empty list in my test. I have a validating HTML 4.01 page with no more content than an image with a map that uses both alt and title attributes for each area. The href attributes all point to relative links. The funny part is, Firefox _does_ show the links when you get info on the page, but not in a usable format (i.e., they’re not clickable). In every instance available to me to test this out, it appears that any image map, _including_ client-side ones (not covered by the article’s recommendations), is inherently inaccessible - for all the reasons that Stephen Down gives, plus the fact that they seem completely unviewable in browsers without physical user interaction (do search engines’ bots even read them, then?). *Salt in the wound*: IE 5.2.3 _does_ make the links accessible (well, not really in a WCAG sense) if I add the page to “Page Holder” and then click the “Links” button. Ouch.
  30. Add that, among the Mac-based browsers, IE 5 appears to be the only one that highlights the image map’s polygons as I tab-navigate the page. For as buggy and “dead” as IE 5 is, it still has some features that have yet to be matched by the modern browsers. I personally miss the way the embedded web archive tool worked, and the aforementioned Page Holder and Links functionality… But I digress.
  31. I agree, and have been preaching the same; however, search engine optimisation is more than just on-page optimisation. There are inbound links, pay-per-click, etc.
  32. Accessibility can be sold by pointing out the benefits of doing an accessible design, one of which is better search ranking.
  33. Maybe it’s just because I work in such a small town, but usually all our clients want to know is “Will this improve my Google ranking?” With this article as ammo, I can confidently tell them “Yes.” (That’s all they care to hear anyway. If we told them that pictures of monkeys would improve their Google ranking, they’d be all in.) My first site, one I started when I was 16, got to the top of Google’s rankings for many key search terms. It’s littered with poor markup and fancy JavaScripting, and worst of all, it’s all controlled by tables. (I used to think using DIVs anywhere was amateur…) It got to the top of the rankings because it was actually relevant to the key search terms, and was updated frequently. So, I don’t know how effective accessibility is compared to links and such, but it’s certainly something I will sell to my clients.
  34. For ease of discussion, it is often easier to categorize the field of web development into neat theoretical elements such as ‘accessibility’, ‘SEO’, ‘usability’ or ‘design’, but it is important to stop and realize these are simply labels. In reality, when designing a web site, all of these principles come into play at the same time. A simple example: writing an <h1> tag ticks off three, if not all, of the quoted paradigms. Mr. Hagans’s points are valid, and eruditely written; he is at the sharp end of the SEO field and I know by observation that he knows what he is talking about. What he is saying, however, is not new or radical, which is why so many here make the point that his argument is commonsensical. What the author has done well is highlight the natural overlaps that exist in the real world of web design.
  35. I had just decided to put my portfolio website together at long last, and I was already keen on creating a highly accessible website. A week later, I’m top of the Google rankings when you search on my name… of course I will probably be off the top by the time some of you read this! It’s great to see that following web standards is finally beginning to stand up for itself.
  36. Excellent read. We used this approach a year back and our website was bringing in 80% of our web business. A simple, honest and common-sense approach to building and marketing a website.
  37. The entire topic of search engine optimisation ‘standards’ and ‘best practice’ should always be prefixed “in an ideal world…”, because the fact is that YES, accessibility would help search engine marketability… in an ideal world. The fact is that there’s too much grey/black-hat SEO going on for it to make a dent. I’m also annoyed that SEO idealists always forget that 99% of the business net never gets updated, despite a designer’s best efforts, due to lack of client investment; so insisting that sites ‘have quality content’ completely discriminates against small businesses who only have ten-ish pages (including privacy, t&c, contact information, etc.). *End of rant*
  38. Other great resources for real-life search engine policies can be found here:
    MSN Search: http://channel9.msdn.com/ShowPost.aspx?PostID=132207
    Google Search: http://www.mattcutts.com/blog/type/googleseo/
  39. The latest buzzword, “AJAX”, is the problem on the horizon. I know of many browser-based software applications that are headed in that direction. This just doesn’t seem to bode well for accessibility in general. The information presented was indeed helpful. Thank you.
  40. A far better alternative to image maps is a well-styled list of links.
    See ALA article: http://alistapart.com/articles/sprites
    Real-world example: http://www.seeda.co.uk/ (map in LH col has a simple <ul> behind it)
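    For anyone who hasn’t read the sprites article, the gist of the technique is sketched below (the IDs, coordinates and URLs are invented): an ordinary list of links, with CSS positioning each one over its region of a single background image, so spiders and screen readers still see plain text links:
    <ul id="map">
      <li><a href="/london/" id="london">London</a></li>
      <li><a href="/paris/" id="paris">Paris</a></li>
    </ul>

    #map { position: relative; width: 240px; height: 160px; margin: 0; padding: 0;
           list-style: none; background: url(region-map.gif) no-repeat; }
    #map a { position: absolute; display: block; overflow: hidden; text-indent: -5000px; }
    #london { left: 20px; top: 30px; width: 60px; height: 40px; }
    #paris { left: 120px; top: 80px; width: 60px; height: 40px; }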
  41. A month on from a client re-design, and we’ve been tracking Google’s love for the site. We switched from an old table-based site to a brand-spanking new CSS layout. The results are amazing (although it may shift again - it’s only been a month). Went from greater than 300 to number 1 for a few general words and lots of obscure ones. So those are the actual results I wanted to see, and they’re just awesome.
  42. *test*
  43. I’ve been swearing by “less code, more content” for a while now. It’s truly amazing what can happen when you have easily traversable URLs, keep all your design in a stylesheet, and just stick to having lots and lots of content on your pages. I’m in the process of spinning up my personal site’s results just to prove that spending thousands of dollars with one of these fly-by-night SEO companies is ridiculous. Just make it clean and valuable, and you’ll get all the ranking you want.
  44. Obtaining top rankings through unethical tricks, as some SEOs do, is causing great problems. Sometimes it gets some good results too, but it will not last for long. If search engines find a solution to block all these tricks, it will be a great thing. We use only ethical methods to promote websites. Every step in optimizing is done with extra care, by studying the search engines’ strategies. You can get more details here: http://wwww.seowebsolution.com
  45. I went through your article “High Accessibility Is Effective Search Engine Optimization” and found it quite interesting. As far as Google is concerned, it has recently announced Google XML Sitemaps. Google is encouraging each and every webmaster around the world to add a special XML file named sitemap.xml to their web domain. This XML file consists of several tags:
    1) <url> - the container for each page entry
    2) <loc> - the URL of a webpage
    3) <priority> - the relative priority of a webpage (0.1 to 1.0)
    4) <lastmod> - the last date on which a webpage was modified
    5) <changefreq> - how often a specific page changes (monthly, weekly, or yearly)
    You can visit http://www.sitemapdoc.com to create a free XML sitemap for your website. It’s absolutely free of cost.
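    Putting those tags together, a minimal sitemap.xml might look like the sketch below (the URL and dates are placeholders I’ve invented; the namespace is the one from the published sitemap protocol):
    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- one <url> entry per page -->
        <loc>http://www.example.com/photos/</loc>
        <lastmod>2005-11-20</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>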
  46. Hi! Thank you for this valuable information…
    Greetings from Germany, Sam.
  47. Great article by Andy, as always. From experience as an SEO: if the visitor is able to use the website with ease, that meets one of Google’s goals. If the website is spiderable (good design and navigation), the search engine robots can index the web pages. What is good for the visitor is also good for the search engine robots, in terms of search engine marketing success.
  48. This is my first comment here. When I read this article I wanted to write a comment. I am interested in SEO, and I am trying to learn it. From some SEO experts I learned a few things, but not many SEO experts talk about web standards; they just talk about tricks. Later I realized that CSS/XHTML is the big “trick” of SEO, so I started to learn it, and love it so much. Really good reading.
  49. Thanks for a good read. It is pretty much common sense to follow accessibility guidelines for effective SEO. No one is ever going to know the Google algorithm, yet building a site to standards is definitely going to put you in a position where your on page optimisation is effective. What makes the difference with competitive search terms is the off page optimisation.
  50. Nothing here about URL structure and its impact on accessibility.  Anyone found some recent research about parameter based URLs and to what degree engines are crawling question marks, ampersands, etc.?
  51. Derek “asked”:http://alistapart.com/comments/accessibilityseo?page=3#22 : bq. Isn’t it in Zeldman’s book where Google is called the richest blind web surfer?
    Yes. Yes “it is”:http://books.google.com/books?q=blind+billionaire .
  52. I’d like to index my page with the Google sitemap, but using SSIs for the <head> and first part of <body> of all my pages prevents me from creating a new document title for each page, resulting in a repetitive index. I think there’s got to be a better templating method than using SSIs in this way. Please friends, enlighten me.
  53. Here is one page:
    <!--#include file="head1.html" -->
    <title>Title - Page 1</title>
    <!--#include file="head2.html" -->
    <div>
    <h1>Page 1</h1>
    Lots of stuff in here.
    </div></body></html>
    Here is another page:
    <!--#include file="head1.html" -->
    <title>Title - Page 2</title>
    <!--#include file="head2.html" -->
    <div>
    <h1>Page 2</h1>
    Lots more stuff in here.
    </div></body></html>
    Different titles. Multiple (two in this case) includes in the same page. Kludgy? Maybe. Does it work? You bet.
  54. With reference to Max’s comment: after all these years, content has managed to keep its royal status, and most probably will for eternity.
  55. By meeting the accessibility guidelines, you not only provide disabled people with access to your site; you can also provide keyword-rich alt attributes that can be indexed by search engines. These can be especially beneficial in image searches. If you are in the travel game, you will probably want to make sure you are using alt attributes.
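    A one-line illustration of the point (the filename and wording are invented): a descriptive, keyword-bearing alt attribute serves blind visitors and image search alike:
    <img src="maldives-beach-villa.jpg" alt="Beach villa on stilts over clear water in the Maldives">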
  56. What Hagans has said is undeniably true. Accessibility is king among the SEO rules, as the Google chaps themselves have pointed out. But it does not really succeed as a single honest tool. I have seen spammed pages get a higher rank in Google search, and really accessible pages suffer blackout swarms. Be accessible first, and then a little mischievous. That is the real motto in SEO, as far as success goes.
  57. It would really be good if all designers started following W3C rules.
    XHTML designs really are better for good search engine placement.
  58. I agree with your viewpoints. There is even a considerably sized body of SEO practitioners who see search engines as just another visitor to a site, and try to make the site as accessible to that visitor as to any other who would come to its pages.
    They often see the white hat/black hat dichotomy mentioned above as a false dilemma. The focus of their work is not primarily to rank the highest for certain terms in search engines, but rather to help site owners fulfill the business objectives of their sites. Indeed, ranking well for a few terms among the many possibilities does not guarantee more sales.
    A successful Internet marketing campaign may drive organic search results to pages, but it also may involve the use of paid advertising on search engines and other pages, building high quality web pages to engage and persuade, addressing technical issues that may keep search engines from crawling and indexing those sites, setting up analytics programs to enable site owners to measure their successes, and making sites accessible and usable.
    Still, you really did present a great working criterion for SEO.
  59. The company I work for here in India (WDC) can now be found on the web if you use search words that describe our normal line of activities. But here you can’t work with new ideas; you just have to follow the route with a single ‘yes’ all the time. Still, my efforts as an SEO optimizer make a lot of difference. Even our main competitors are frightened by the rate of our acceleration.
    I have always been a believer in high accessibility. Now you can even find us on the front page of Google if you search for our straight name. I would love to be a Flash programmer soon, but I still think my SEO efforts are extraordinary. Maybe our vice president would take it very seriously if it ended up on his laptop.
  60. Interesting how this article passes the test of time. IMHO, so does table-based HTML. I have always had great success with table-based HTML 4.01, whether it validated or not. I guess since I started designing way back in the ’90s (a dog’s age in internet time), I am inclined to prefer older code that has been tweaked over time. I have never felt that standards made any difference in rankings. I will say that clean code is always the best code, and that cleaning up a site can affect rankings, so one aspect here could be accurate. I suppose this is all beauty in the eye of the beholder.
    :)
  61. I need some proof; let’s have some rock-solid proof. With that I can do wonders!
  62. Effective search engine optimization is SEO without SEO tricks; see Google’s guidelines.
    Your competitor is watching you.
  63. My company is in the process of reconstructing its navigation-generation tool, and unfortunately they developed a tool that generates the navigation as DHTML using bloated JavaScript calls. If I could have been in the design phase, I would have tried to convince them to do something like Suckerfish (manipulating an unordered list with CSS). Now that I’m stuck with this, I’m wondering if there is something I can do in parallel with this menu to offer something SEO-friendly, and perhaps even accessible too. I was thinking maybe to have a page not only render the regular nav using JavaScript, but also simultaneously render the same menu in a hidden DIV layer as a list for search engines to spider. This scares me, because I don’t want to get the infamous BMW ban for having content hidden from human eyes. Does anyone have any suggestions?
  64. Ron, don’t go down the hidden-DIV-layer route; you are likely to get the site banned by one of your competitors reporting hidden text. Your best bet would be to redesign using Suckerfish and kill the JavaScript altogether.
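    For reference, the Suckerfish idea in miniature (the menu items and URLs are invented, and the original technique uses off-screen positioning plus a small JavaScript shim for IE rather than this bare display toggle, so treat it as a sketch): a plain nested list that spiders can crawl, with CSS revealing the submenu on hover:
    <ul id="nav">
      <li><a href="/products/">Products</a>
        <ul>
          <li><a href="/products/widgets/">Widgets</a></li>
          <li><a href="/products/gadgets/">Gadgets</a></li>
        </ul>
      </li>
    </ul>

    #nav li ul { display: none; }        /* submenus hidden by default */
    #nav li:hover ul { display: block; } /* revealed on hover */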
  65. White-hat SEO is good SEO.
  66. It is true, as I have read in many such articles, that certain people regard search engine optimization as a dirty trick. In a way they are not far off the mark: as white-hat as it may be, all sorts of tricks and procedures are needed to compete successfully in search engine space. But it is not our fault. We inherited the system from the search engines, not the other way round, and it is these systems that we fiddle around with to find the necessary information about the algorithms before manipulating them. What’s so bad about that?
  67. I was disappointed to hear during a Google webinar event on 22 Oct (‘Webmaster Chat - “Tips and Treats”’) that Google places no ranking benefit on a page that is well marked up versus one that is not. They made the point that great content put up by someone who does not know how to semantically mark up that content should not be penalised. For me, this answer from Google feels credible, but at the same time I am disappointed to hear that one of the many claimed benefits of web standards, that they improve SE ranking, is perhaps not true. I will always build to web standards for all the many good reasons, but perhaps I need to temper my enthusiasm for that place in the Venn diagram of life where web standards and SEO meet?
  68. Regarding the recent Google webinar and the comment by Alan above, that Google places no ranking benefit on a page that is well marked up versus one that is not: I think John and Matt were talking about “strict markup”, not markup in general. The information indexed has to be parsed based on the markup, and surely when that is done properly there would be a benefit? Now I am confused…
  69. I (and I think Mike above) would be delighted if a venerable ALA staffer would give their view on this (#65 and #66). Thanks in advance for any additional comments on this. Cheers, -Alan
  70. This article is a great way to show the parallel between good practices and direct benefits for your business.
    I’ve found that it is really difficult to sell “web standards” to customers. (Almost) nobody seems to care. The proof of this is that a very high percentage of sites are not standards-compliant, and this is not because of a lack of resources, as I point out in a post on web standards and Fortune 500 companies: http://www.aggiorno.com/blog/post/Benchmarking-DOCTYPE-validation-in-Fortune-500s-Web-sites.aspx
    On the other hand, if you point to the benefits of web standards, like accessibility, SEO, reduction in maintenance costs, a better chance of displaying correctly on mobile browsers, etc., you do get some traction in the conversation. In any case, I believe Google does care about well-written code, and I summarize a number of examples in this post: http://www.aggiorno.com/blog/post/Web-Standards-and-Search-Engine-Optimization-(SEO)-Does-Google-care-about-the-quality-of-your-markup.aspx