High Accessibility Is Effective Search Engine Optimization

by Andy Hagans

71 Reader Comments

  1. A month on from a client redesign, we’ve been tracking Google’s love for the site. We switched from an old table-based site to a brand-new CSS layout. The results are amazing (although they may shift again; it’s only been a month): we went from below position 300 to number 1 for a few general words and lots of obscure ones.

    So those are the actual results I wanted to see, and they’re just awesome.

  2. test

  3. I’ve been swearing by “less code, more content” for a while now. It’s truly amazing what can happen when you have easily traversable URLs, keep all your design in a stylesheet, and just stick to having lots and lots of content on your pages. I’m in the process of spinning up my personal site’s results just to prove that spending thousands of dollars with one of these fly-by-night SEO companies is ridiculous. Just make it clean and valuable, and you’ll get all the ranking you want.

  4. Obtaining top rankings through unethical tricks, as some SEOs do, is causing great problems. Sometimes it gets good results too, but they will not last for long. If the search engines find a way to block all of these tricks, it will be a great thing. We use only ethical methods to promote websites; every optimization step is done with extra care, after studying the search engines’ strategies. You can get more details here: http://wwww.seowebsolution.com

  5. I went through your article “High Accessibility Is Effective Search Engine Optimization” and found it quite interesting. As far as Google is concerned, it has recently announced Google XML Sitemaps.

    Google is encouraging every webmaster around the world to add a special XML file named sitemap.xml to their web domain. This XML file consists of several tags:

    1) <url> – container for each page entry
    2) <loc> – URL of a webpage
    3) <priority> – relative priority (0.1 to 1.0) of a webpage
    4) <lastmod> – last date on which a webpage was modified
    5) <changefreq> – how often a page changes (monthly, weekly, or yearly)

    You can visit http://www.sitemapdoc.com to create an XML sitemap for your website, absolutely free of charge.
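
    For reference, a minimal sitemap.xml using these tags might look like this (the domain and dates below are just placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2005-11-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>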

  6. Hi! Thank you for this valuable information…
    Greetings from Germany

    Sam.

  7. Great article by Andy, as always. From my experience as an SEO, letting visitors use a website with ease is one of Google’s own goals. If a website is spiderable (good design and navigation), the search engine robots can index its pages. What is good for the visitor is also good for the search engine robots, in terms of search engine marketing success.

  8. This is my first comment here.

    When I read this article I want to write a comment.

    I am interested in SEO, and I am trying to learn it. I learned a few things from some SEO experts, but not many of them talk about web standards; they just talk about tricks. Later I realized that CSS/XHTML is the big “trick” of SEO, so I started to learn it, and I love it very much.

    Really good reading.

  9. Thanks for a good read. It is pretty much common sense to follow accessibility guidelines for effective SEO. No one is ever going to know the Google algorithm, yet building a site to standards is definitely going to put you in a position where your on page optimisation is effective. What makes the difference with competitive search terms is the off page optimisation.

  10. Nothing here about URL structure and its impact on accessibility. Has anyone found recent research about parameter-based URLs, and to what degree the engines are crawling question marks, ampersands, etc.?

  11. Derek “asked”:http://alistapart.com/comments/accessibilityseo?page=3#22 :

    Isn’t it in Zeldman’s book where Google is called the richest blind web surfer?


    Yes. Yes “it is”:http://books.google.com/books?q=blind+billionaire .

  12. I’d like to index my page with the Google sitemap, but using SSIs for the <head> and first part of <body> of all my pages prevents me from creating a new document title for each page, resulting in a repetitive index. I think there’s got to be a better templating method than using SSIs in this way. Please friends, enlighten me.

  13. Here is one page:

    <!--#include file="head1.html" -->
    <title>Title – Page 1</title>
    <!--#include file="head2.html" -->
    <div>
    <h1>Page 1</h1>

    Lots of stuff in here.

    </div></body></html>

    Here is another page:

    <!--#include file="head1.html" -->
    <title>Title – Page 2</title>
    <!--#include file="head2.html" -->
    <div>
    <h1>Page 2</h1>

    Lots more stuff in here.

    </div></body></html>

    Different titles. Multiple includes (two, in this case) in the same page.

    Kludgy? Maybe.

    Does it work? You bet.

  14. With reference to Max’s comment: after all these years, content has managed to keep its royal status, most probably for eternity.

  15. By meeting the accessibility guidelines, you not only provide disabled people with access to your site; you can also provide keyword-rich alt text that search engines will index. This can be especially beneficial in image searches. If you are in the travel game, you will probably want to make sure you are using your alt attributes.
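
    For instance (the filename and wording here are purely illustrative), a descriptive alt attribute reads naturally in a screen reader and gives the engines indexable copy at the same time:

    <img src="santorini-sunset.jpg"
         alt="Sunset over the caldera on Santorini, Greece" />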

  16. What Hagans has said is undeniably true. Accessibility is king among the SEO rules, as the Google chaps themselves have pointed out. But it does not really succeed as a single honest tool. I have seen spammed pages get a higher rank in Google search, and really accessible pages suffer blackout swarms. Be accessible first, and then a little mischievous: that is the real motto in SEO, as far as success goes.

  17. It would be really good if all designers started following W3C rules.
    XHTML designs really are better for good search engine placement.

  18. I agree with your viewpoints. There is even a considerably sized body of SEO practitioners who see search engines as just another visitor to a site, and try to make the site as accessible to those visitors as to anyone else who comes to its pages.
    They often see the white hat/black hat dichotomy mentioned above as a false dilemma. The focus of their work is not primarily to rank the highest for certain terms in search engines, but rather to help site owners fulfill the business objectives of their sites. Indeed, ranking well for a few terms among the many possibilities does not guarantee more sales.
    A successful Internet marketing campaign may drive organic search results to pages, but it may also involve the use of paid advertising on search engines and other pages, building high-quality web pages to engage and persuade, addressing technical issues that may keep search engines from crawling and indexing those sites, setting up analytics programs to let site owners measure their successes, and making sites accessible and usable.
    Still, you really did present a great set of SEO working criteria.

  19. Now the company I work for here in India (WDC) can be found on the web if you use search words that describe our normal line of activities. But here you can’t work with new ideas; you just have to follow the route, with a single “yes” all the time. Still, my efforts make a lot of difference as an SEO optimizer. Even our main competitors are frightened by the rate of our acceleration.
    I have always been a believer in high accessibility. You can now even find us on the front page if you search Google for our exact name. I would love to become a Flash programmer soon, but I still think my SEO efforts are extraordinary. Maybe our vice president would take it very seriously if we ended up on his laptop.

  20. Interesting how this article passes the test of time. IMHO, so does table-based HTML. I have always had great success with table-based HTML 4.01, whether it validated or not. I guess since I started designing way back in the 90s (a dog’s age in internet time), I am inclined to prefer older code that has been tweaked over time. I have never felt that standards made any difference in rankings. I will say that clean code is always the best code, and that cleaning up a site can affect rankings, so one aspect here could be accurate.

    I suppose this is all beauty in the eye of the beholder.
    :)

  21. I need some proof; let’s have some rock-solid proof. With that I can do wonders!

  22. Effective search engine optimization is SEO without SEO tricks; see Google’s guidelines.
    Your competitor is watching you.

  23. My company is in the process of reconstructing its navigation-generation tool, and unfortunately they developed a tool that generates the navigation as DHTML using bloated JavaScript calls. If I had been in on the design phase, I would have tried to convince them to do something like Suckerfish (manipulating an unordered list with CSS). Now that I’m stuck with this, I’m wondering if there is something I can do in parallel with this menu to offer something SEO-friendly, and perhaps even accessible too. I was thinking of having a page render not only the regular nav using JavaScript but also, simultaneously, the same menu in a hidden DIV layer as a list for search engines to spider. This scares me, because I don’t want to get the infamous BMW ban for having content hidden from human eyes. Does anyone have any suggestions?

  24. Ron, don’t go down the hidden-DIV route; you are likely to get the site banned when one of your competitors reports the hidden text.

    Your best bet would be to redesign using Suckerfish and kill the JavaScript altogether.
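
    For anyone unfamiliar with the Suckerfish idea, the core of it is a plain nested list that any spider can read, with CSS doing the hiding and showing (the class name and URLs below are illustrative; the original technique also adds a small script to patch IE’s missing li:hover support, but the markup stays fully spiderable):

    <ul class="nav">
      <li><a href="/products/">Products</a>
        <ul>
          <li><a href="/products/widgets/">Widgets</a></li>
          <li><a href="/products/gadgets/">Gadgets</a></li>
        </ul>
      </li>
    </ul>

    /* hide submenus until their parent item is hovered */
    .nav li ul { display: none; }
    .nav li:hover ul { display: block; }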

  25. White-hat SEO is good SEO.

  26. It is true, as I have read in many such articles, that certain people regard search engine optimization as a dirty trick. In a way they are not far off the mark: as white-hat as it may be, all sorts of tricks and procedures are needed to compete successfully in search engine space. But it is not our fault. We inherited the system from the search engines, not the other way round, and it is these systems that we fiddle around with to find the necessary information about the algorithms before manipulating them. What’s so bad about that?

  27. I was disappointed to hear during a Google webinar event on 22 Oct:

    ‘Webmaster Chat – “Tips and Treats”’

    that Google places no ranking benefit on a page that is well marked up versus one that is not. They made the point that great content put up by someone who did not know how to semantically markup that content should not be penalised.

    For me, this answer given by Google feels credible but at the same time I am disappointed to hear that one of the many benefits of web standards, that it improves SE ranking, is perhaps not true.

    I will always build to web standards for all the many good reasons, but perhaps I need to temper my enthusiasm for that place in the Venn diagram of life where web standards and SEO meet?

  28. Regarding the recent Google webinar and Alan’s comment above, that Google places no ranking benefit on a page that is well marked up versus one that is not: I think John and Matt were talking about “strict markup”, not markup in general. Surely the indexed information has to be parsed based on the markup, and when that is done properly there would be a benefit? Now I am confused…

  29. I (and I think Mike above) would be delighted if a venerable ALA staffer would give their view on this (#65 and #66). Thanks in advance for any additional comments on this. Cheers, -Alan

  30. This article is a great way to show the parallel between good practices and direct benefits for your business.
    I’ve found that it is really difficult to sell “web standards” to customers. Almost nobody seems to care. The proof of this is that a super-high percentage of sites is not standards-compliant, and this is not because of a lack of resources, as I point out in a post on web standards and Fortune 500 companies:
    http://www.aggiorno.com/blog/post/Benchmarking-DOCTYPE-validation-in-Fortune-500s-Web-sites.aspx

    On the other hand, if you point to the benefits of web standards (accessibility, SEO, reduction in maintenance costs, a better chance of displaying correctly on mobile browsers, etc.), you do get some traction in the conversation.

    In any case, I believe Google does care about well written code and I summarize a number of examples in this post:

    http://www.aggiorno.com/blog/post/Web-Standards-and-Search-Engine-Optimization-(SEO)-Does-Google-care-about-the-quality-of-your-markup.aspx

  31. It’s not just personal experience, either. I was introduced to one of the black-hat SEO guides (for Google specifically) last month, in which various people measured the effect of certain Google-defeating tricks over time, and the enduring, verified techniques were virtually all compatible with best-practice accessible, semantic site code and content design. Nearly all the others faded in usefulness over time, or even became actively penalised by Googlebot, but good, clear, and simple code still has my sites 2nd and 4th on an ego-search at the time of writing.
