High Accessibility Is Effective Search Engine Optimization
Issue № 207


Many web designers view search-engine optimization (SEO) as a “dirty trick,” and with good reason: search engine optimizers often pollute search engine results with spam, making it harder to find relevant information when searching. But in fact, there is more than one type of search-engine optimization. In common usage, “black-hat” SEO seeks to achieve high rankings in search engines by any means possible, whereas “white-hat” SEO seeks to code web pages in a way that is friendly to search engines.


In Using XHTML/CSS for an Effective SEO Campaign, Brandon Olejniczak explains that many web design best practices overlap with those of white-hat SEO. The reason is simple: such practices as separating style from content, minimizing obtrusive JavaScript, and streamlining code allow search engines to more easily spider, index, and rank web pages.

Two years later, I am going to take Brandon’s conclusions a step further. I have been a search engine optimizer for several years, but have only recently become infatuated with web accessibility. After reading for weeks and painstakingly editing my personal website to comply with most W3C Web Content Accessibility Guidelines, I have come to a startling revelation: high accessibility overlaps heavily with effective white-hat SEO.

Accessibility for all users, even search engines

On further reflection, this overlap makes sense. The goal of accessibility is to make web content accessible to as many people as possible, including those who experience that content under technical, physical, or other constraints. It may be useful to think of search engines as users with substantial constraints: they can’t read text in images, can’t interpret JavaScript or applets, and can’t “view” many other kinds of multimedia content. These are the types of problems that accessibility is supposed to solve in the first place.

Walking through a few checkpoints

Now that I’ve discussed the theory of why high accessibility overlaps with effective SEO, I will show how it does so. To do this, I am going to touch upon each Priority 1 checkpoint in the W3C Web Content Accessibility Guidelines which affects search-engine optimization.

1.1 Provide a text equivalent for every non-text element (e.g., via “alt”, “longdesc”, or in element content)…

Not only are search engines unable to understand image and movie files, they also cannot interpret any textual content that is based on vision (such as ASCII art). alt and longdesc attributes will, therefore, help them understand the subject of any such content.

Search engines are also “deaf” in reference to audio files. Again, providing textual descriptions to these files allows search engines to better interpret and rank the content that they cannot “hear.”
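
As a rough sketch of what this looks like in markup (the filenames and wording below are invented for illustration), an image and an audio clip might each carry a text equivalent like this:

    <img src="lighthouse.jpg"
         alt="White lighthouse on a rocky shore at dusk"
         longdesc="lighthouse-description.html" />

    <a href="interview.mp3">Listen to the interview (MP3, 10 minutes)</a>
    <p>Transcript: in this interview, the author explains how accessible
    markup helps search engines index multimedia content.</p>

The alt text and the transcript give a spider something it can actually read, index, and match against queries.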

1.2 Provide redundant text links for each active region of a server-side image map.

Text links are very important to search engines, since anchor text often succinctly labels the content of a link’s target page. In fact, many search engine optimizers consider anchor text to be the single most important factor in modern search algorithms. If a website uses an image map rather than a text-based menu as the primary navigational method, a redundant text-only menu elsewhere on the page will give search engines additional information about the content of each target page.
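
Here is a hedged sketch of the idea (the URLs and labels are made up): the image map stays in place for sighted users, while a plain text menu on the same page gives spiders crawlable links with descriptive anchor text.

    <a href="/cgi-bin/navmap">
      <img src="navmap.gif" alt="Site navigation map" ismap="ismap" />
    </a>

    <!-- Redundant text links for the same destinations -->
    <p>
      <a href="/products/">Products</a> |
      <a href="/support/">Support</a> |
      <a href="/contact/">Contact</a>
    </p>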

4.1 Clearly identify changes in the natural language of a document’s text and any text equivalents (e.g., captions).

Major search engines maintain country and language-specific indexes. Specifying the language of a document (or of text within a document) helps search engines decide in which index(es) to place it.
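
In XHTML this amounts to a few attributes; a minimal sketch (the languages chosen here are arbitrary examples) might look like this:

    <html xmlns="http://www.w3.org/1999/xhtml" lang="en" xml:lang="en">
      ...
      <p>The French call this <span lang="fr" xml:lang="fr">joie de vivre</span>.</p>
      ...
    </html>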

6.3 Ensure that pages are usable when scripts, applets, or other programmatic objects are turned off or not supported […]

Some users choose to disable JavaScript and applets in their browser’s preferences, while other users’ browsers do not support these technologies at all. Likewise, search engines’ “browsers” do not read scripts; therefore a webpage’s usability should not be crippled when scripts are not supported. Otherwise, search engines may not even index the page, let alone rank it well.
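
One common pattern, sketched below with a hypothetical openCatalog() function, is to put a real URL in the href so the link still works, and is still crawlable, when scripting is unavailable:

    <!-- Hard for spiders (and script-less browsers) to follow: -->
    <a href="javascript:openCatalog()">Catalog</a>

    <!-- Crawlable fallback, with the script layered on top: -->
    <a href="/catalog/" onclick="return openCatalog();">Catalog</a>

If openCatalog() returns false after doing its work, scripted browsers get the fancy behavior, while spiders and script-less browsers simply follow the ordinary link.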

14.1 Use the clearest and simplest language appropriate for a site’s content.

It is a bit less obvious how this particular checkpoint aids SEO. But if a website contains the “clearest and simplest language appropriate for the site’s content,” it is probably using those keywords with which potential searchers will be most familiar. Searchers tend to use succinct queries containing familiar language. Thus, to receive maximum traffic from search engines, it is best that a website contain the same words which the site’s audience will use when searching.
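
As a contrived example, compare a heading written in internal jargon with one written in the words a searcher would actually type:

    <!-- Jargon few searchers will ever type: -->
    <h1>Integrated Residential Hydration Solutions</h1>

    <!-- Plain language that matches real queries: -->
    <h1>Home Water Filters</h1>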

The benefits do not end with Priority 1—many of the Priority 2 and 3 Checkpoints are important for SEO purposes, too. For instance, Checkpoints 6.2 and 6.5 refer to the accessibility of dynamic content. In fact, making dynamic content search engine-friendly is one of the most daunting tasks a search engine optimizer faces when working on an ecommerce or database-driven site. Following the W3C’s recommendations can help to avoid any indexing or ranking problems related to using dynamic content.
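
For instance, a long query string full of parameters and session IDs is harder for spiders to handle than a short, static-looking address; the URLs below are invented purely to illustrate the contrast:

    Harder to crawl:  http://www.example.com/product.php?cat=7&id=1432&sessid=b8f3c2
    Friendlier:       http://www.example.com/products/canoes/red-two-person/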

From the horse’s mouth

If you doubt any of the above, perhaps a visit to Google’s Webmaster Guidelines could convince you that Google rewards high accessibility. This page specifically mentions best practices which will help Google “find, index, and rank your site.”

Design and Content Guidelines:

  • Make a site with a clear hierarchy and text links. Every page should be reachable from at least one static text link.
  • Offer a site map to your users with links that point to the important parts of your site. If the site map is larger than 100 or so links, you may want to break the site map into separate pages.
  • Create a useful, information-rich site, and write pages that clearly and accurately describe your content.
  • Think about the words users would type to find your pages, and make sure that your site actually includes those words within it.
  • Try to use text instead of images to display important names, content, or links. The Google crawler doesn’t recognize text contained in images.
  • Make sure that your title and alt tags are descriptive and accurate. […]

Technical Guidelines:

  • Use a text browser such as Lynx to examine your site, because most search engine spiders see your site much as Lynx would. If fancy features such as JavaScript, cookies, session IDs, frames, DHTML, or Flash keep you from seeing all of your site in a text browser, then search engine spiders may have trouble crawling your site.
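
A quick way to get that spider’s-eye view yourself is to dump a page through Lynx from the command line (substitute your own address; example.com is just a placeholder):

    lynx -dump http://www.example.com/ | less

If the navigation, headings, and body copy all survive in the plain-text dump, a search engine spider can most likely read them too.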

Note that each of Google’s guidelines actually correlates with a W3C Web Content Accessibility Guideline. (Oddly enough, the word “accessibility” does not actually appear in Google’s Webmaster Guidelines. Perhaps they are afraid of scaring off some webmasters with technical jargon? In any case, it is clear that Google is lobbying for high accessibility.)

SEO: just another feather in accessibility’s cap

The checkpoints I highlighted above are just a few of the many ways that high accessibility will help optimize a website for search engines—many of the other checkpoints in the W3C Web Content Accessibility Guidelines are helpful to SEO, as well. Of course, to most web designers, the goal of accessibility is (and should be) to make sites accessible to all people, independent of their platform or any disabilities they have. But if accessibility gets a website more traffic from Google, even better!

The good news is that a web designer who follows best practices for accessibility is already practicing solid white-hat SEO. Search engines need not scare anyone. When in doubt, design your site to be accessible to blind and deaf users as well as those who view websites via text-only browsers, and SEO will fall into place.

71 Reader Comments

  1. I find it almost amazing that so many good practices in web design come down to using common sense.
    When you try like me to create fast-loading, accessible, maintainable and findable web sites, you will find that these different aspects do not cancel each other out, but in fact the same solution applies to all and a solution for one problem benefits the other.
    This article shows how making a site accessible also benefits your findability.
    Building with the Standards in mind gives so many benefits you wonder why not everyone is doing it.

  2. There is way too much BS floating around these days. SEO shouldn’t be a career – it’s just a practice of effective web developers.

    I would like to explore web semantics as a means to optimize search engine results, or to make those results more meaningful.

    What this is all about is getting information into a format that is universally understandable by machines and humans alike – across all platforms.

    At work, my case for adopting microformats is beginning to be heard because I’m calling it “SEO”. Most companies don’t care about things like device independence, handicapped accessibility, or ease-of-development – they only care about money and traffic. Calling the adoption of web standards or applying semantics to a page “Search Engine Optimization” may well be the excuse we’ve all been looking for 🙂

  3. I had come to this conclusion myself a while ago.

    The W3C has accessibility validators, which are quite useful for checking parts of your white-hat SEO.

  4. #2 has a great point – if businesses start to think of incorporating accessibility and standards into their sites as something that can help make them money, they’re far more likely to go for it.

  5. It is noticeable that Google indexes accessible sites better than others. This is another motivation for web designers and developers to change the way they create sites.

  6. Now I’m not one to poo-poo these kinds of articles–this is exactly what I preach to my clients day-in and day-out.

    But what I’ve yet to see (possibly through my own lack of metrics) is actual, feasible results. The kind of results clients will pay for when I offer them “additional SEO work for their site” (legacy sites only, of course–accessibility and CSS layout should be mandatory on all new stuff).

    Does anyone in this discussion have something of that nature to offer? I feel it would be a huge boost.

  7. I have been saying much of this to folks where I work; unfortunately I haven’t been able to find the correlative materials (other articles, etc.) that could make a viable case for this. However, I had also not thought of tying it directly into SEO.

    Armed with this, while I may still get a fair amount of guff for hopping on my soapbox to preach accessibility in web design again, I know that some folks, those who really care anyway, will be listening.

    Thanks for a great read.

  8. A few years ago when I was job-hunting, I was called into an interview specifically because my name had jumped to the top of the Google search results (this is no longer the case, but then that’s probably a good thing – I doubt that most people searching for Matt Robinson are looking for me!) The interviewer wanted to know how I’d done it, what the magic secret was that made Google sit up and notice me above all the other Matt Robinsons out there, and the answer was “simple, accessible, semantically marked-up HTML”.

    It’s not just personal experience, either. I was introduced to one of the black-hat SEO (for Google specifically) guides last month, where various people measured the effect of certain Google-defeating tricks over time, and the enduring and verified techniques were virtually all compatible with best-practice accessible, semantic site code and content design. Nearly all the others faded in usefulness over time, or even became actively penalised by Googlebot, but good, clear and simple code still has my sites 2nd and 4th on an ego-search at the time of writing.

  9. As a professional SEO interested in web design and accessibility/ergonomics matters, I can testify to the validity of everything said in this article.

    I even wrote an article very similar to this one some time ago (in French): http://s.billard.free.fr/referencement/index.php?2005/01/13/3-article-referencement-et-accessibilite

    By respecting standards and accessibility guidelines, you are sure to remove all barriers that could block spiders, and to present information in a well-structured and semantically meaningful way.

  10. I am currently redesigning a web site whose sole purpose is to showcase photos that I take. I started using XHTML/CSS/Accessibility just to get a feeling of what these technologies have to offer.

    The main content of this site is photographs. Apart from properly using the alt and longdesc attributes, and including pertinent captions for every photograph, is there anything else I can do to improve the site’s future listing in search engines?

  11. G Guzi,

    A good way to improve *any* site’s search engine listings is to get links–quality, relevant inbound links. Without them, your site is just an “island” to Google, no matter how accessible it is.

  12. Yes, it’s common sense, yes I and countless others have been preaching this for quite a while to our clients, but now we have an actual well-written article on a respected site with links to other respected articles that we can point our clients to when they are asking how clean, semantic markup can improve their search rankings.

    Thanks for that!

  13. I’ve always thought this, accessibility and web standards do go hand in hand with SEO. A search engine bot is almost like a person with a disability because it can’t see or hear.

    Here’s a link to a website that displays the text as search engine bot would see it.

    http://www.seo-browser.com/

  14. i always considered guideline 1.2 to be obsolete. who nowadays is still using server-side image maps? they’re completely inaccessible to keyboard users, hence the need for 1.2, and rely on convoluted server-side CGI or similar to determine which region was actually activated based on the X and Y coordinates of the mouse click…
    now, client side image maps, fine. there, 1.1 applies: you should provide alternative text for each AREA.

  15. 3.5 Use header elements to convey document structure and use them according to specification.

    Some (most?) search engines will give more weight to headers. Write good header text, which ideally includes some of your keywords (as long as it’s in a natural way…not just keyword stuffing everything in an H1)

    6.5 Ensure that dynamic content is accessible or provide an alternative presentation or page.

    If you generate large chunks of important content or navigation via something like javascript, search engines won’t be able to see it, index it, or follow any links that were created.

    7.5 Until user agents provide the ability to stop auto-redirect, do not use markup to redirect pages automatically. Instead, configure the server to perform redirects.

    Spiders may ignore any meta refresh or JavaScript-based redirection. Redirecting on the server is just the most transparent method for all.

    13.1 Clearly identify the target of each link.

    Meaning: write good, clear link text (which, as the article mentions, is important). It’s not good enough to have lots of “click here” links…

    4.2 Specify the expansion of each abbreviation or acronym in a document where it first occurs.
    5.5 Provide summaries for tables.

    Other legit ways to get some more keywords on your page, while helping users understand your content.

    Oh, and related to my previous comment: yes, there is an equivalent of 1.2 for client-side image maps as well: “1.5 Until user agents render text equivalents for client-side image map links, provide redundant text links for each active region of a client-side image map” … however I’d argue that we’ve now come to a situation where the “until user agents” part is fulfilled by the majority of browsers currently in circulation.

  16. I did a little experiment in setting up a client-side image map and used *alt* text for all areas, but could not get any kind of link listing or alt text to appear when I turned off images in Firefox, Opera and OmniWeb. Using Fangs screen-reader emulator came up with nothing too. Nor did using a web-based Lynx emulator.

    Do modern browsers, or at least ADA-capable browsers allow access to client-side image maps, or does one need to explicitly replicate the links with text-only links?

  17. Great article – I wrote a similar article not too long ago. One problem I often experience with SEO is a matter of who is writing the content. Most of my clients’ sites are CMS-based, so most of the content is produced by the clients. No matter how much I try to educate the clients, they never seem to get it right when it comes to writing SEO text. Even the most basic use of descriptive links is forgotten. I can’t count the times I’ve told my clients not to use “click here” links.

    Any suggestions on how to make clients more focused on SEO?

  18. ‘Search engines are also “deaf” in reference to audio files. Again, providing textual descriptions to these files allows search engines to better interpret and rank the content’: No, what you need are caption files to index, which is massively more difficult and also exceedingly rare.

    Also, shouldn’t you have talked about Flash?

  19. Isn’t it in Zeldman’s book where Google is called the richest blind web surfer? Whoever I’m stealing that line from, it’s been pretty useful. Nothing motivates clients better than talking about all that money they aren’t making.

  20. Repetition forms a statement? Call me daft, but I have the feeling I’ve read all the arguments before in different articles on A List Apart.

    Obviously it benefits people, looking at the discussion, but I feel this is all old hat and that A List Apart is moving away from the cutting edge to the mainstream; which is a shame in my opinion.

  21. bq. I did a little experiment in setting up a client-side image map and used alt text for all areas but could not get any kind of link listing or alt text to appear when I turned off images in Firefox, Opera and OmniWeb. Using Fangs screen-reader emulator came up with nothing too. Nor did using a web-based Lynx emulator.

    There are a couple of points here.
    In Opera, Ctrl+J brings up a list of all links. I would have thought Firefox would have a similar feature.

    It is important to provide “redundant” text links for a client side image map, for a number of reasons.
    – Some users might not realise that the map is clickable.
    – Some users might have images turned off.
    – Some users might have motor trouble that means they have difficulty positioning the cursor accurately.
    – Some users might have impaired vision, and can read text (which they display at a large size) but can not see images as clearly.

    Generally, the best solution to a client-side image map is to repeat the links underneath as plain text – that way, it is clear to everybody how they can get to the page they want.

  22. bq. There are a couple of points here. In Opera, Ctrl+J brings up a list of all links. I would have thought Firefox would have a similar feature.

    Seems as though Opera doesn’t show the list of links contained in a client-side image map either, because I got an empty list with my test. I have a validating 4.01 page with no more content than an image with a map that uses both alt and title attributes for each area. The href attributes all point to relative links.

    Funny part is, with Firefox, it _does_ show the links when you get info on the page, but it’s not in a usable format (i.e., it’s not clickable). In every instance I have available to me to test this out, it appears that any image map, _including_ client-side (not specified by the article’s recommendations), is inherently inaccessible.

    For all the reasons that Stephen Down gives, plus the fact that they seem completely unviewable (do search-engines’ bots even read them then?) in browsers without user physical interaction.

    *Salt in the wound*: IE 5.2.3 _does_ make the links accessible (well, not really in a WCAG sense of it) if I add the page to “Page Holder” and then click the “Links” button. Ouch.

  23. Add that among the Mac-based browsers, IE 5 appears to be the only one that highlights the imagemap’s polygons as I tab-navigate the page.

    For as buggy and “dead” as IE 5 is, it still has some features that have yet to be matched by the modern browsers. I personally miss the way the embedded web archive tool worked, and the aforementioned Page Holder and Link functionality… But I digress.

  24. I agree and have been preaching the same; however, search engine optimisation is more than just on-page optimisation. There are inbound links, pay per click, and so on…

  25. Maybe it’s just because I work in such a small town, but usually all our clients want to know is “Will this improve my Google ranking?” With this article as ammo, I can confidently tell them “Yes.” (That’s all they care to hear anyway. If we told them that pictures of monkeys would improve their Google ranking, they’d be all in.)

    My first site, one I started when I was 16, got to the top of Google’s rankings for many key search terms. It’s littered with poor markup, fancy JavaScripting, and worst of all, it’s all controlled by tables. (I used to think using DIVs anywhere was amateur.) It got to the top of the rankings because it was actually relevant to the key search terms, and was updated frequently.

    So, I don’t know how effective accessibility is compared to links and such, but it’s certainly something I will sell to my clients.

  26. For ease of discussion it is often easier to categorize web development into neat theoretical elements such as ‘Accessibility’, ‘SEO’, ‘Usability’ or ‘Design’, but it is important to stop and realize these are simply labels. In reality, when designing a web site, all of these principles come into play at the same time. A simple example is that the writing of a … tag ticks off three, if not all, of the quoted paradigms.

    Mr. Hagans’s points are valid, and eruditely written; he is at the sharp end of the SEO field and I know by observation that he knows what he is talking about. What he is saying however, is not new or radical, hence why so many here make the point that his argument is commonsensical. What the author has done well is highlight the nature of the natural overlaps that exists in the real world of web design.

  27. I had just decided to put my portfolio website together at long last, and I was already keen on creating a highly accessible website. A week later and I’m top of the Google ranking when you search under my name… of course I will probably be off the top by the time some of you read this! It’s great to see that following web standards is finally beginning to stand up for itself.

  28. Excellent read. We used this approach a year back and our website was bringing in 80% of our web business. Simple, honest and common-sense approach to building and marketing a website.

  29. The entire topic of search engine optimisation ‘standards’ and ‘best practice’ should always be prefixed “in an ideal world…”, because the fact is that YES, accessibility would help search engine marketability… in an ideal world. The fact is that there’s too much grey/black hat SEO going on for it to make a dent. I’m also annoyed that SEO idealists always forget that 99% of the business net never gets updated, despite a designer’s best efforts, due to lack of client investment, so insisting that sites ‘have quality content’ completely discriminates against small businesses who only have ten-ish pages (including privacy, t&c, contact information etc.). *End of rant*

  30. The latest buzzword, “AJAX”, is the problem on the horizon. I know of many browser-based software applications that are headed in that direction. This just doesn’t seem to bode well for accessibility in general.

    The information presented was indeed helpful. Thank you.

  31. A month on from a client re-design, and we’ve been tracking Google’s love for the site. We switched from an old table-based site to a brand-spanking new CSS layout. The results are amazing (although it may shift again – it’s only been a month). Went from greater than 300 to number 1 for a few general words and lots of obscure ones.

    So those are the actual results I wanted to see, and they’re just awesome.

  32. I’ve been swearing by less code, more content for a while now. It’s truly amazing what can happen when you have easily traversable URLs, keep all your design in a stylesheet and just stick to having lots and lots of content on your pages. I’m in the process of spinning up my personal site results just to prove that spending thousands of dollars with one of these fly-by-night SEO companies is ridiculous. Just make it clean and valuable, and you’ll get all the ranking you want.

  33. Top rankings obtained by some SEOs through unethical tricks are causing great problems. Sometimes it gets some good results too, but it will not last for long. If search engines find a solution to block all these tricks, it would be a great thing. We use only ethical methods to promote websites. Every step in optimizing is done with extra care, by studying the search engines’ strategies. You can get more details from here: http://wwww.seowebsolution.com

  34. I went through your article “High Accessibility Is Effective Search Engine Optimization” and found it quite interesting. As far as Google is concerned, it has recently announced Google XML Sitemaps.

    Google is encouraging each and every webmaster around the world to add a special XML file named sitemap.xml to its web domain. This XML file consists of several tags, such as:

    1) <urlset> and <url> – the elements that wrap the sitemap and each page entry
    2) <loc> – the URL of a webpage
    3) <priority> – the priority (0.1 to 1.0) of a web page
    4) <lastmod> – the last date on which a webpage was modified
    5) <changefreq> – how often a specific page changes (monthly, weekly, or yearly)

    You can visit http://www.sitemapdoc.com to create a FREE XML SITEMAP for your website. It’s absolutely free of cost.

  35. Great article by Andy, as always. From experience as an SEO, making the website easy for the visitor to use is one of the goals of Google. If the website is spiderable (good design and navigation), the search engine robots can index the web pages. What is good for the visitor is also good for the search engine robots in terms of search engine marketing success.

  36. This is my first comment here.

    When I read this article I wanted to write a comment.

    I am interested in SEO, and I am trying to learn it. I learned a few things from some SEO experts, but not many SEO experts talk about web standards; they just talk about lots of tricks. Later I realised that CSS/XHTML is the big “trick” of SEO, so I started to learn it, and love it so much.

    Really good reading.

  37. Thanks for a good read. It is pretty much common sense to follow accessibility guidelines for effective SEO. No one is ever going to know the Google algorithm, yet building a site to standards is definitely going to put you in a position where your on page optimisation is effective. What makes the difference with competitive search terms is the off page optimisation.

  38. Nothing here about URL structure and its impact on accessibility. Anyone found some recent research about parameter based URLs and to what degree engines are crawling question marks, ampersands, etc.?

  39. I’d like to index my page with the Google sitemap, but using SSIs for the head and first part of all my pages prevents me from creating a new document title for each page, resulting in a repetitive index. I think there’s got to be a better templating method than using SSIs in this way. Please friends, enlighten me.

  40. Here is one page:


    Title – Page 1

    Page 1

    Lots of stuff in here.

    Here is another page:


    Title – Page 2

    Page 2

    Lots more stuff in here.

    Different titles. Multiple (two in this case) includes in the same page.

    Kludgy? Maybe.

    Does it work? You bet.

  41. By meeting the accessibility guidelines, you not only provide disabled people with access to your site, you can also provide keyword-rich alt text that can be indexed by search engines. This can be especially beneficial in image searches. If you are in the travel game, you will probably want to make sure you are using alt text.

  42. What Hagans has said is unfailingly true. Accessibility is king in the SEO rules, as the Google folks themselves point out. But it does not really succeed as a single honest tool. I have seen spammed pages get higher rank in Google search, and really accessible pages suffer blackout swarms. Be accessible first, and then a little mischievous. This is the real motto in SEO, as far as success goes.

  43. I agree with your viewpoints. There is even a considerably sized body of SEO practitioners who see search engines as just another visitor to a site, and try to make the site as accessible to those visitors as to any other who would come to the pages.
    They often see the white hat/black hat dichotomy mentioned above as a false dilemma. The focus of their work is not primarily to rank the highest for certain terms in search engines, but rather to help site owners fulfill the business objectives of their sites. Indeed, ranking well for a few terms among the many possibilities does not guarantee more sales.
    A successful Internet marketing campaign may drive organic search results to pages, but it also may involve the use of paid advertising on search engines and other pages, building high quality web pages to engage and persuade, addressing technical issues that may keep search engines from crawling and indexing those sites, setting up analytics programs to enable site owners to measure their successes, and making sites accessible and usable.
    Still, you really did present a great working SEO criterion.

  44. Now the company I work for here in India (WDC) can be found on the web if we use search words that describe our normal line of activities. But here you can’t work with new ideas; you just have to follow the route, with a single ‘yes’ all the time. Still, my efforts do make a lot of difference as an SEO optimizer. Even our main competitors are frightened at the rate of our acceleration.
    I have always been a believer in high accessibility. Now you can even find us on the front page of Google if you search for our exact name. I would love to be a Flash programmer soon, but I still think my SEO efforts are extraordinary. Maybe our vice president would take it very seriously if it ended up on his laptop.

  45. Interesting how this article passes the test of time. IMHO, so does table-based HTML. I have always had great success with table-based HTML 4.01, whether it validated or not. I guess since I started designing way back in the 90’s (a dog’s age in internet time), I am inclined to prefer older code that has been tweaked over time. I have never felt that standards made any difference in rankings. I will say that clean code is always the best code and that cleaning up a site can affect rankings, so one aspect here could be accurate.

    I suppose this is all beauty in the eye of the beholder.
    🙂

  46. My company is in the process of reconstructing its Navigation Generation tool, and unfortunately they developed a tool that generates the navigation as DHTML using bloated JavaScript calls. If I could have been in the design phase, I would have tried to convince them to do something like Suckerfish (manipulating an unordered list with CSS). Now that I’m stuck with this, I’m wondering if there is something I can do in parallel with this menu to offer something SEO-friendly and perhaps even accessible too. I was thinking maybe to have a page not only render the regular nav using JavaScript but also simultaneously render the same menu in a hidden DIV layer as a list for search engines to spider. This scares me because I don’t want to get the infamous BMW ban for having content hidden from human eyes. Does anyone have any suggestions?

  47. Ron, don’t go down the hidden DIV route; you are likely to get the site banned by one of your competitors reporting hidden text.

    Your best bet would be to redesign using suckerfish and kill the javascript altogether.

  48. It is true, as I have read many such articles, that certain people regard search engine optimization as a dirty trick. In a way they are not far off the mark, as white hat as it may be, all sorts of tricks and procedures are needed to successfully compete in search engine space. But it is not our fault. We inherited the system from the search engines, not the other way round, and it is these systems that we fiddle around with to find the necessary information about the algorithms before manipulating them. What’s so bad about that?

  49. I was disappointed to hear during a Google webinar event on 22 Oct:

    ‘Webmaster Chat – “Tips and Treats”‘

    that Google places no ranking benefit on a page that is well marked up versus one that is not. They made the point that great content put up by someone who did not know how to semantically markup that content should not be penalised.

    For me, this answer given by Google feels credible but at the same time I am disappointed to hear that one of the many benefits of web standards, that it improves SE ranking, is perhaps not true.

    I will always build to web standards for all the many good reasons, but perhaps I need to temper my enthusiasm for that place in the venn diagram of life where web standards and SEO meet?

  50. In the recent Google webinar mentioned in Alan’s comment above, Google said it places no ranking benefit on a page that is well marked up versus one that is not. I think John and Matt were talking about “strict markup”, not markup in general. I am sure the indexed information has to be parsed based on markup, and when that is done properly there would be a benefit? Now I am confused…

  51. I (and I think Mike above) would be delighted if a venerable ALA staffer would give their view on this (#65 and #66). Thanks in advance for any additional comments on this. Cheers, -Alan

  52. This article is a great way to show the parallel between good practices and direct benefits for your business.
    I’ve found that it is really difficult to sell “web standards” to customers. Nobody (almost) seems to care. The proof of this is that a super high percentage of sites are not standards compliant, and this is not because of a lack of resources, as I point out in a post on Web Standards and Fortune 500 Companies:
    http://www.aggiorno.com/blog/post/Benchmarking-DOCTYPE-validation-in-Fortune-500s-Web-sites.aspx

    On the other hand, if you point to the benefits of web standards, like accessibility, SEO, reduction in maintenance costs, a better chance of displaying correctly on mobile browsers, etc., you do get some traction in the conversation.

    In any case, I believe Google does care about well written code and I summarize a number of examples in this post:

    http://www.aggiorno.com/blog/post/Web-Standards-and-Search-Engine-Optimization-(SEO)-Does-Google-care-about-the-quality-of-your-markup.aspx

