Any internet marketing professional will tell you, just as we will, that an effective Search Engine Optimization campaign can generate more traffic than an expensive banner-ad program or a costly, time-consuming pay-per-click campaign. Some of the best ways to optimize a website are keeping page file size down, maintaining a good content-to-code ratio, using plenty of relevant content, and filling the page with as much text and as many links as you can without “spamming” the search engine spiders.
We’re not going to cover all of the basics of XHTML and CSS. We assume that you have a basic, working knowledge of the two languages and have some experience in utilizing them. We suggest reading ALA’s CSS articles, and Zeldman’s Better Living through XHTML if you need more on the basics or more reasons to switch over.
We’re going to be focusing entirely on the benefits of using XHTML and CSS to show you how to improve the readability of your code for search engine spiders, maintain a good content-to-code ratio without going beyond file-size and word-count limits, and how to use CSS to mimic common image effects.
There will be – in no way, shape or form – any unethical methods of SEO covered here. If you’re reading this article in the hopes of learning how to get an adult site listed in the “school supplies” category on Google, we kindly suggest you fall off the face of the earth. Any hate mail regarding this can be directed to sally@morekinky.net. It’s due time to pay her back for all those “petting zoo pictures” that manage to bypass my spam filtering system.
File size and content ratios
The best way to begin an optimization project is to make sure that all your code is readable. Search engine spiders work much the same way that the human eye does, and if there’s too much “junk text” in your HTML, it’s going to be hard for spiders to know what is what and to decide that a page is relevant to a particular category. Improving your structural organization will not only make the code easier for you to read, but also ensure that search engine spiders know what you’re trying to show them.
Visually, a human reader sees the bold green Arial text set at 24px at the top of the page as the main title of your website. However, if your markup isn’t implemented well, a search engine spider will not. Instead of using this to declare your title:
<strong><font color="#00FF00" size="24px">Main Title of My Very First Website</font></strong>
Put this in your XHTML:
<h1>Main Title of My Very First Website</h1>
…And this in your CSS:
h1 {font-family: Arial; size: 24px; color: #00FF00; font-weight: bold;}
Using XHTML to declare your main heading as an H1 ensures that a search engine spider knows the contained text is the title of the page, while styling it with CSS yields the desired design effect. Using H1 through H6 accordingly, you can apply the same methodology to the sub-headings on your website and let the search engine spiders know that they, too, are important. But don’t forget to use one or two of your keywords in the heading; otherwise it won’t make much of a difference either way.
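To sketch how the rest of the heading hierarchy might be handled (the sub-heading text and values below are illustrative only, not part of the article’s example), a keyword-bearing sub-heading could be marked up like this:
<h1>Main Title of My Very First Website</h1>
<h2>A Keyword-Rich Sub-Heading</h2>
…with a matching rule in the CSS:
/* sub-headings styled a step smaller than the main heading */
h2 {font-family: Arial, sans-serif; font-size: 18px; color: #006600; font-weight: bold;}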
Using Images Wisely
An H1 tag placed directly after the <body> tag will be weighted heavily by many search engines, especially if it contains one or two of your keywords. But sometimes, putting a nice big heading tag right after your body tag can detrimentally affect your masthead image placement. Using CSS, we can work around this little hang-up in some instances by placing our masthead logo as a background. To do so, add the following code to the body declaration in your CSS file:
background-image: url(../images/logo.png);
background-repeat: no-repeat;
background-attachment: fixed;
background-position: top left;
Embedding the logo as a background image instead of dropping it straight into the HTML makes that piece of the puzzle fit together nicely. (Unless, of course, you don’t need the image to appear before your H1 element, in which case you can simply place it in the markup.)
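Putting the pieces together, a minimal sketch might look like this (the file path, padding value, and dimensions are placeholders rather than anything prescribed by the article):
body {
/* the masthead logo is painted by the browser; it adds no markup for spiders to wade through */
background-image: url(../images/logo.png);
background-repeat: no-repeat;
background-attachment: fixed;
background-position: top left;
padding-top: 120px; /* placeholder: leave room for the logo above the content */
}
…with the heading first in the markup:
<body>
<!-- the first thing a spider encounters is the keyword-rich heading -->
<h1>Main Title of My Very First Website</h1>
</body>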
Pesky JavaScript Rollovers
Everyone loves rollover effects. Even my completely dimwitted grandmother likes it when an e-mail link rotates every time she hovers over it. Using CSS allows you to emulate rollover effects with grace and lower file size, but one of the greatest benefits is providing more textual content for spiders to “read.” Using CSS to dictate rollover effects instead of separate images will give you a marginal, yet effective advantage in the fight for search engine positioning, especially if the textual link is one of your keywords.
Tim Murtaugh’s Mo’ Betta Rollovers is a great guide to using :hover states to emulate common image rollovers.
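As a quick illustration of the idea (the class name, colors, and link text are invented for this sketch, not taken from the article or from Murtaugh’s guide), a keyword-rich text link can be given a rollover effect with a single :hover rule:
<a class="nav" href="/blue-widgets/">Blue Widgets</a>
/* real text for spiders to read, styled to behave like an image rollover */
a.nav {color: #006600; background-color: #ffffff; text-decoration: none;}
a.nav:hover {color: #00aa00; background-color: #eeffee; text-decoration: underline;}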
Wrapping it all up
There are many ways to increase a website’s search engine positioning. Some of them are extremely effective, while others are offensively unethical. Using standards-compliant methods to create highly readable, content-rich code can aid in your campaign to dominate a search engine category. Every website has a place on the net; just make sure you know where your place is.
While this article is mostly common-sense tips, it is nice to know that there are added benefits to authoring simple markup with XHTML/CSS.
I wish you had figures to back your arguments.
I’ve employed an h1 containing an image with descriptive alt text for a bit, though with nagging suspicions that this is less than commendable. It does validate just fine, thank you, and Google does see the alt text. Any reasons to avoid this technique?
David, there’s nothing wrong with that technique. It provides design control while maintaining accessibility and avoids problems some screen readers have with the Fahrner/Bowman CSS image replacement technique.
ALA and Brandon, thanks for this article. I knew most of these arguments but it is nice to have them posted here … it gives me more ammunition when persuading clients that clean XHTML markup and CSS layout are good for them.
For those in the know on CSS/XHTML, most (if not all) of these ideas presented will be second-nature, and things you’ll most likely be doing already.
Re: David’s H1 title image: While David’s technique is sound in that it provides textual information for spiders as well as the proper placement of a header image, the H1 status does not carry through to the alt text. A search engine spider will see it only as alt text, not as a main-level heading or the title of the page. Since I have not tried this method before, it is hard to say whether it would prevail over a plain text heading such as <h1>Title textual information here</h1>.
As Brandon said, all of the techniques presented in this article are in the arsenal of the XHTML/CSS-savvy already. The typical ALA reader will certainly be well-versed in such techniques.
For a more complete guide to search engine optimization, as part of a comprehensive look at web site optimization in general, let me recommend Andy King’s book “Speed Up Your Site: Web Site Optimization.” The accompanying web site ( http://www.websiteoptimization.com/ ) seems to be down at the moment, but the book can be found at Amazon.com ( http://www.amazon.com/exec/obidos/tg/detail/-/0735713243/103-3673928-6194224 ).
You could have replaced “completely dimwitted” with “”
Thanks!
It could have been a good article (the intro suggested as much), but it ended up being /very/ thin on information. Hmm, use H1-H6? OK, what else? Use of keywords? The importance of links? Although mentioned, their importance wasn’t spelled out where it counts. How did this get OK’d for publication? It’s hardly worth reading!
laurent@bearteam.org asked for figures to support the claims. This site, http://www.einfachfueralle.de/, was relaunched in early May, and through the use of h1-h4, FIR, (almost) no text GIFs, and whatever else is recommended in the article, we made it from somewhere around page 20 to positions in the lower single digits for all relevant keywords. Go figure.
/T
“I wish you had figures to back your arguments.”
Not hard numbers, buuut… We designed a site without paying any attention to SEO, just standards, and it did very well on that alone. A little extra SEO and now it’s #1 for its main search terms.
“Use of Keywords ? Importance of links ?”
The title says it all. What you mention are SEO strategies, nothing to do with CSS or XHTML.
Another excellent resource for SEO would be http://www.webmasterworld.com.
I’ve got to agree with Dude. This article was REALLY thin. I was expecting a second and third page, but it just ended.
I was looking forward to reading this one, but it was all common sense stuff – nothing new at all really.
It might have been a good article if it had been backed up by some good research, or gave some insight into what effect the position of different (X)HTML elements within the document has on different search engines.
I think the most important thing to remember when coding (X)HTML is to be honest and represent your content fairly. If something is the main header, you mark it up with an H1; next-level headers get an H2 tag, and so on. Use lists to represent lists.
Quite simply, utilise HTML properly, then format it with CSS and the search engines will index your site in the best way possible.
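A small sketch of what that honest markup might look like (the headings and list items here are invented purely for illustration):
<h1>Acme Widgets</h1>
<h2>Product Range</h2>
<!-- a list of products marked up as an actual list -->
<ul>
<li><a href="/widgets/blue/">Blue widgets</a></li>
<li><a href="/widgets/green/">Green widgets</a></li>
</ul>
/* presentation stays in the stylesheet */
ul {list-style: square; margin-left: 1.5em;}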
I’ve always used VSE Be Found (http://www.vse-online.com/submit-website/index.html). Although it’s only for Mac (and Classic at that), there’s no better assistant for pre-launch meta analysis/optimization. Plus it submits to all of the major search engines. It also explains the things you can do wrong, which may get you penalized by certain search engines (i.e., use of repetitive keywords, etc.). Their site is also very informative if what you’re looking for wasn’t covered by this article.
This article was by no means meant as a complete guide to optimizing a site. Its primary purpose was to inform the knowledgeable CSS and XHTML author about things they are already familiar with and how implementing them will improve their search engine optimization.
Mr. Ward said it best when he stated that using (X)HTML properly and formatting it to your design specifications with CSS will give you the best chance of being indexed properly.
One of the greatest benefits of using CSS/XHTML to optimize your website is the separation of content from code. Quite simply, the less code you have in your HTML, the more room your content has to speak for itself. None of these techniques will necessarily guarantee higher rankings, nor will they improve your rankings if you don’t have quality content to back them up.
Speaking along the lines of using headers, the same visual effect could just as simply be achieved by creating a class called .pageheader and styling it exactly the same way. The question that arises, though, is whether a search engine will treat an element carrying that class as the equivalent of a true h1, since most if not all search engines know that H1 is used to designate the primary heading of a page.
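To make the comparison concrete (the class name, title text, and CSS values here are illustrative only):
<!-- generic markup: carries no heading semantics for a spider -->
<div class="pageheader">Title Of Page</div>
<!-- semantic markup: unambiguous to a spider -->
<h1>Title Of Page</h1>
/* both can be styled to look identical */
.pageheader, h1 {font-family: Arial, sans-serif; font-size: 24px; color: #00FF00; font-weight: bold;}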
While it may go against usability guidelines at times, another benefit of using standards-based design to optimize your pages is the ability to order your code any way you wish and place it appropriately on the page with CSS positioning. Quite simply, you can put your heading at the very top of your HTML, followed by a paragraph briefly describing your content, and then the content itself. Or you could place your heading first, then a sidebar blurb, much like the one at the top of the sidebar adjacent to ALA articles, followed directly by the primary content. CSS can then be used to place these pieces of text in their respective positions on the page.
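A bare-bones sketch of that source-ordering idea (the ids, widths, and text are placeholders):
<h1>Main Title of My Very First Website</h1>
<div id="blurb">A short, keyword-rich summary of the page.</div>
<div id="content">The primary content follows immediately in the source.</div>
/* the blurb is read early by spiders but displayed off to the side */
#blurb {float: right; width: 180px;}
#content {margin-right: 200px;}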
There are three key factors in optimizing a website. The first, and by far the most important, is quality content. Copywriting can be a tedious task that many of us take for granted; without enough text about the given subject, your page is highly unlikely to be found relevant to any search terms. The second is how you code your web pages: using XHTML you can create semantically correct pages without sacrificing your design. The third, and equally important, is follow-through. Maintaining an optimized page through content updates and the like is just as important, because your pages get re-visited.
Some links for more information:
http://www.bruceclay.com/searchenginechart.pdf – Bruce Clay’s Search Engine Relationship Chart
http://hotwired.lycos.com/webmonkey/01/23/index1a.html – Search Engine Optimization – FREE
http://www.searchenginewatch.com – Danny Sullivan’s Search Engine Watch
http://www.searchengines.com/
When I saw this article mentioned on Zeldman, I thought it might be a great article to send to a client who knows her company home page needs some SEO work and is trying to rationalize a budget for doing it. But, this article offered NOTHING to use. Not to mention the nasty, patronizing tone, such as a reference to “completely dimwitted” grandma. What a waste of bandwidth.
If you have an image for your logo or keywords, you could also put an h1 containing the company name (for example) above that image. Then just hide that h1 (i.e. h1#hidden {display: none}).
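For instance, a minimal version of that (the id, file name, and company name are placeholders):
<h1 id="hidden">Acme Widgets</h1>
<img src="logo.gif" alt="Acme Widgets" />
/* hide the text heading; the logo image carries the visual branding */
h1#hidden {display: none;}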
To be fair, as Sonia pointed out, the title of the article does put the emphasis on using XHTML, CSS, standards, etc. to improve search engine ranking, so in that respect the article explains itself quite well. But I think all conscientious standards advocates out there will be doing everything stated in the article anyway, so doesn’t that put all (to coin a continuously used set of phrases) ‘lean, low-bandwidth, elegant XHTML’ front ends on an immediately equal ranking? It would have been nice to see some XHTML-based SEO techniques which aren’t necessarily used by everyone. Something along the lines of the ‘hide the h1’ post would be an example, as used on StopDesign.com (knock out the CSS to see this). So would this improve SEO?
The article covers general info that most XHTML/CSS authors already know, although it’s always interesting to see how someone else does it. I teach an HTML authoring class, and always get nailed with the search engine question, “How can I get my site to rank highest?” I give them a two-part answer: a) lots of $$$ (ads, etc.) or b) structuring their XHTML well, letting CSS do the presentational work, and writing good copy. The b part of the answer gets mixed results, but when it works it works well, and you save that yearly fee to rank your site. Thanks for writing the article; my students read ALA and the Daily Report and this will help…
I’d have liked a bigger article but once the point had been made there was no need to repeat it. From my own experience I have seen very good SE results through clean pages that rely heavily on CSS for as much presentation as daringly possible.
I suppose once everyone discovers its additional benefits we’ll be back to square one again, with all SEO more or less doing the same thing.
In the meantime reap the benefits from the search engines while you can.
This article is very timely. I’m planning on mentioning this very topic (the SEO benefits of clean XHTML and CSS) while talking to the Macromedia User’s Group I belong to. Any articles like this that point out benefits, regardless of what they may be, help a great deal.
Thanks for putting this together, Brandon.
While good web developers will know why XHTML+CSS is a Good Thing, many managers/clients will refuse to believe it until someone else says so. Hence, being able to reference a document online adds weight to your argument.
Plus, I’d never really thought about the effects on search engine ranking that standards compliance could have… so it was good to bring the point to my attention.
The grandma crack does make it risky to give out the URL though – should have been edited out. Beautiful code doesn’t mean perfect copy! 🙂
I was totally surprised by the length of the article. There should’ve been more material in it. As it is… it looks more like a blog post than an ALA article. (And no, don’t tell me that ALA is a blog.)
I’m quite bemused by some of the comments in this discussion so far. I mean, SEO is a sub-function of Internet Marketing, and yet some of the marketers who’ve commented seem to have completely missed the fact that they are not your primary audience.
The opening paragraph should have informed them:
“ANY INTERNET MARKETING PROFESSIONAL will tell you, just as we will…”
As a professional Internet Marketing Consultant of many years’ standing, I agree. For an audience of web designers (professional and amateur alike) the advice that using CSS and XHTML can help with SEO is sound. Leaner code, faster-downloading pages, the ability to use an H1 tag without it defaulting to a huge chunky size, rollovers that can be crawled: all great benefits.
As for some of the questions and ‘tips’ in the posts above, well, no major search engine gives as much ‘weight’ or credence to ALT text as to regular text. Even if you put H1 tags around it.
Plain text keywords will beat keywords in an image’s ALT text in every engine that matters. In fact, in some of the most important engines, plain text keywords will beat ALT text even when it is wrapped in H1 tags, making the technique far from ‘optimal’ for search engines.
“put an h1 containing the company name (for example) above that image. Then just hide that h1 (i.e. h1#hidden {display: none}).”
The trouble with this is that idiots misuse it. Google is very clear about hiding anything: Don’t.
“Quality Guidelines – Specific recommendations:
1. Avoid hidden text or hidden links.” says Google
http://www.google.com/webmasters/guidelines.html
You see, if you are in a market that is competitive enough to *need* to use hidden content, then conversely, it is *too* competitive to use hidden content. One of the competitors *will* make a spam report to Google (after examining your code to see how you rank well).
While some very good sources have been mentioned already, my final point is to offer one more:
http://www.cre8asiteforums.com/
Owned by a professional Usability and UI Consultant who also does SEO, yet with plenty of the more traditional SEO related topics of course, Cre8asite takes a slightly more holistic view of marketing than many webmaster forums.
Re: Mark Orbit’s comment that we’ll be back to square one once everyone discovers the benefits of good XHTML and CSS coding: I don’t agree!!
I run a website. Unfortunately I am only involved with the content and moderation issues and do not get a say on the coding behind the pages. We are running the site on PHP-Nuke, and the code it creates (or the template creates) is hideous.
We also struggle to rank well in a search engine because of all the crap in the code. How does a search engine know what is a title on any of my pages? What parts are lists? The semantic meaning of my site is currently close to nil.
Any site that has good semantic definition will allow the search engines to give them fair representation. Currently some parts of my site (footers for example) are being given much more weight than they deserve by the search engines. I could sort that problem if my site was properly marked up.
The URL to my web site is in the above link if anyone is interested in looking at the code (it’s not a pretty sight!!)
I may just have a go at a PHP-nuke template one of these days and see if I can get my site using good code!!
Missed out the URL!!
http://centurions.rlfans.com
Articles are like code: either lean or bloated. Come to think of it, any form of communication is like code. Lean or bloated.
The author said what he had to say … and that’s that.
Let’s look at this from a utopian perspective: it is the future. Everyone who makes websites is using standards. Only .01% of websites are obsolete. We’ve all learned to use clean, valid, semantically correct markup. There’s no mixing of content and design anymore.
In this future, every website is rendered equal, on the same level. Search engine spiders can read them all with the same ease. Does this mean Google is going to have to find new algorithms to rank websites? Absolutely not. If everyone were using standards, the only algorithm necessary would be relevancy. One website on cardiovascular workouts is going to be more relevant than another, guaranteed.
The future is not about writing great code. It’s about writing good content.
Wow,
As usual Ammon makes some great points. Cre8asite is an excellent resource.
WRT the hide-the-h1-image technique written up on Stop Design and mentioned here, it does involve hiding text and could be considered a little risky WRT Google. (Now that StopDesign has comments available, maybe someone should post a ‘warning’.)
We got dinged by Google on one site when using Eric Meyer’s hide/show menu text (a ‘hidden and revealed’ span), a technique/example that he uses on his own website.
Another cutting edge thing that may or may not be dangerous:
The excellent ‘Pure-CSS Tabs’ (http://kalsey.com/2003/05/css_tabs_with_submenus) are something we are afraid to try after our bad luck with Eric Meyer’s technique described above. We figure (but what do we know, really; this is speculation based on experience) that Google seems not to mind hidden/shown elements when they’re mixed with JavaScript, but may bite you when it’s pure CSS.
“This article was by no means meant as a complete guide to optimizing a site. Its primary purpose was to inform the knowledgeable CSS and XHTML author about things they are already familiar with and how implementing them will improve their search engine optimization.”
why bother?
if the authors are already knowledgeable and familiar with information, why would we care? what would be useful is a well-written article that we could use to convince Marketing Professionals, Executives and everyone else with their fingers in a project why XHTML and CSS are useful.
this article spends more time telling the reader “what they already know” and what’s not going to be covered than it does in providing useful information.
even some simple discussion of what search engines you should target, and why XHTML and CSS are great food for primary SE spiders would have been useful.
this article is thin, and far from the quality i’ve come to expect from ALA.
too short.
I think many people who have replied here seem to be under the impression that there is some kind of magic to SEO. I know people like Ammon have spent many years learning about search engines and various marketing techniques to really know them in great detail and be able to get as much out of them as possible. But the basics of SEO have been the same for a few years: good content with unfussy, well marked-up code.
The fact is, many people don’t go along with web standards and don’t think to make the best of their text. The article is perfectly acceptable as it is. It is about how XHTML can help you with search engines, not about specific tricks to get ahead of everyone else; those are tricks that the likes of Google don’t like, and they’re why things like hiding the H1 are a dodgy tactic. I contacted Doug Bowman about what I felt was a risky technique for search engines as far as the FIR trick was concerned, for the simple fact that they don’t like hidden text!
Yes, following web standards can help with SEO. If you’re already following web standards then of course there wasn’t going to be anything new in the article; that’s what it was all about! However, people who want to know more, or need convincing of the pros of using valid markup, are more the kind of target audience for the article.
For most of you who are already well versed in CSS/XHTML, a simple line saying ‘Using XHTML in accordance with the W3C standards can help you get a better rank in search engines’ would have done, but that wouldn’t really have made much of an article, and it wouldn’t have helped the people who needed the ‘how will it help’ type of info. It’s one thing to do it; it’s another to understand why you do it. Something I think the likes of Zeldman and Meyer generally do very well in looking at the ‘bigger picture’.
When I converted my personal site over to Movable Type earlier this year, I made a concerted attempt at setting everything in XHTML and CSS. I used many of the techniques Brandon mentions in his article, and my site’s Google click-through rate has gone up at an unbelievable rate. Mind you, one of the most popular searches is “Tights Gallery”: I have some pictures in my gallery from a Tie and Tights party we had, all very innocent, but I suspect the visitors to those pages are somewhat disappointed!
I have to agree that structural markup is paramount, and I still can’t believe how many sites ignore this!
Brandon mentions the importance of having the title in text, not just in an image. However, having the author’s name in text is important as well. His appears only in an image, nowhere in text searchable by a spider. He is invisible to the web search engines.
I really don’t know why you bothered. Not wishing to be egotistical but you really have a lot to learn about CSS. What you did using JavaScript can also be done using CSS and HTML.
And I know you were merely explaining how to optimize a web page for search engine indexing, but come on, is it really necessary to waste time using JavaScript when a little experimentation with CSS and HTML will also do the trick? Articles like this are most often misleading and a complete waste of time for those in the know. If you wish to show us something, at least take the time to do it properly.
If you are going to tout using CSS then please use correct CSS in your one and only bit of code. The correct property is FONT-SIZE, not SIZE. The SIZE property controls the size and orientation of a page (for printing). See http://www.w3.org/TR/REC-CSS2/page.html#propdef-size
… to say about SEO?!
And that was constructive HOW? If you’re gonna complain then at least keep it relevant…one could say “make the most of your text” in that regard.
Ammon: thanks for your tips. But what do you think of my hidden logo and title at http://www.via-israel.com ? I coded it this way since the logo is already embedded in the background image and I am too lazy to position a transparent PNG logo with a JavaScript/ActiveX hack… Anyway, the logo and title are hidden, but not for PDAs/smartphones/Lynx-like browsers/NS4…
Very little information. But maybe that’s all the information there is on this topic. If so, fair enough.
I feel sorry for his Grandmother.
This was way below the standard I’ve come to expect and enjoy from ALA stories in the past. ALA’s strapline is “for people who make websites”, right? Nothing covered in this article was in any way ‘extended technique’ for anyone who’s ever read an ALA story before.
I don’t expect one of the three (?) main points of an ALA story to be ‘use heading tags’.
I have just started a web site, http://www.zone4health.com, and am desperately looking for more cost-effective ways to get the name out there. For the novice, this was a brilliant article. You have my utmost appreciation!
Jerome asked: “…what do you think of my hidden logo and title at http://www.via-israel.com ? I coded it this way since the logo is already embedded in the background image and I am too lazy to position a transparent PNG logo with a JavaScript/ActiveX hack… Anyway, the logo and title are hidden, but not for PDAs/smartphones/Lynx-like browsers/NS4…”
Hi Jerome. I certainly understand your reasons, and the implementation is both practical and sensible. You do have a hidden H1 heading, but it repeats exactly what would otherwise be hidden from user-agents that do not support images. In other words, what you have is a cross-browser compatibility device rather than a method for hiding keywords.
In fact, I commend you for not trying to take further advantage. It would be easy for many to think that if you were going to have a little hidden text anyway, legitimately, then adding in a couple of extra keywords wouldn’t hurt. Your honesty is your final and strongest protection.
You see, even if some jealous competitor were to personally report your hidden text to Google or any other search engine, I believe that the employee following up on that report would determine that there was no intention to deceive or ‘trick’ the spider, and so would reject the complaint immediately.
What I’m saying is that the technique you have used has been used well and with honest intention. That makes a big difference in the final analysis.
In fairness to all at ALA, I wish to ask that any further questions or requests for analysis be brought to the forums where I and others give advice freely, so that this discussion doesn’t get side-tracked into discussions of individual sites.
Anyone wishing to ask about specific issues on their sites, rather than comment about the article and issue under discussion here, is most welcome to seek me out at http://www.cre8asiteforums.com/
Dudes, you really need to learn to use CSS properly and update your web site. I used to be able to read the “titchy small tiny fonts” that many of you egotists seem to prefer, but in the long run, the only thing “titchy small tiny fonts” do to a person is make them blind.
Update your CSS to enable older bods with less-than-perfect eyesight (like meself) to change the fonts to suit us.
ROFLMAO, watch ’em squirm as they scramble to find out how to achieve this. I bet they don’t update their web site to enable this feature.
Because if you are going to do something, either do it to the best of your abilities, or don’t bother doing it.
While these tips are good for optimizing the returned results of a search that hits your site [and there’s a lot to be said for that] I don’t really know how much this advice could help one’s actual search ranking. I don’t think there are any big engines anymore that actually use the page itself for ranking outside of keyword appearance.
Having a legible summary on a search engine results page can be extremely valuable though.
You’re giving away all the search engine optimisation secrets! Free! 😉 Just kidding – it’s a really great article and I’ll be pointing my clients to it to help them understand the basics. Good one!
Edward, when you’re using server-side languages like PHP, the code, as I’m sure you know, is processed on the server before being sent to the browser as standard HTML.
When an SE bot requests a web page from a web server, the same process applies: the page is requested, the PHP engine processes it, and the result is sent to the user agent.
With the kind of example you posted above, I don’t think any SE will have problems crawling it, as if it were any other static HTML page. You can get problems with certain areas of page dynamics, with things like variables being passed in the URL, for example. It’s getting better, but there are things, well outside the scope of this article, that can be done to make dynamic pages more ‘search engine friendly’. If you want more info on that kind of thing, I’d suggest following myself and Ammon to www.cre8asiteforums.com, where there are a few people more than willing to help. The article here was very specifically about XHTML and engines and how coding with standards can help. Anything outside that would be better discussed elsewhere, I think!
And DudeMan, I don’t think it’s especially hard to find articles on using relative font sizes instead of px to get user-adjustable fonts. If you’re complaining, you could use a browser like Opera, which has a full-page zoom function that always works…
” if you are going to do something, either do it to the best of your abilities, or don’t bother doing it.”
Isn’t always true in business; your best may take too long or use too many resources to be viable. Over at http://www.accessifyforum.com it’s been discussed that maybe it would be better for people to be told whether their site conforms to Bobby A or AA (within what the automated tests can handle) rather than just being told what they need to do to make the site ‘perfect’. That would provide more encouragement to web designers not fully conversant with accessibility, by acknowledging that they have at least made an effort, rather than demanding an all-or-nothing approach before they get any kind of recognition.
It’s always nice to aim for the best, but people also have to learn, and businesses need to take ROI into account: can they justify the extra expense to make it ‘perfect’ when 85% ‘perfect’ may be good enough?
Shame about the syntax errors. The deprecated font size attribute accepts a number or a percentage; it doesn’t take units. There is no size property in CSS; there’s a font-size property, but it’s a really bad idea to use pixels for it (http://diveintoaccessibility.org/day_26_using_relative_font_sizes.html).
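For what it’s worth, a corrected version of the article’s heading rule might look something like this (the font stack and em value are just one possible choice):
/* font-size, not size; a relative unit lets readers resize the text */
h1 {font-family: Arial, Helvetica, sans-serif; font-size: 1.5em; color: #00FF00; font-weight: bold;}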
Nice reading (except for that curious remark about your granny), though not groundbreaking news for web designers, as is often to be found here.
More often than not, SEO-required H1 tags interfere with my layout needs, so I get rid of their formatting entirely using margin, float, and line-height.
Demo at http://www.byteshift.de/tips/get-rid-of-h1-formatting
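A rough idea of what such a neutralized heading rule could look like (the exact declarations are guesses, not taken from the linked demo):
/* strip the default heading presentation while keeping the h1 semantics */
h1 {float: left; margin: 0; padding: 0; line-height: 1; font-size: 1em; font-weight: normal;}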
I can’t see why it makes any difference to the search engine crawlers when they read a webpage whether it starts with an <h1> right after the body tag. The thing is a computer, right? What difference does it make to the computer if it is just looking for the <h1> tag? White space, garbage or script, what difference does it make as long as the computer recognises it isn’t the <h1>? I can see there might be a content length limit, i.e. a couple of lines. See what I mean? That’s why it makes no sense to me to worry about having your logo JPG before your <h1> title. Or has Google etc. actually stated something along these lines?
Thanks for the article. Other than the grandma comment, which was rude, the article was brief but helpful. I’ve been doing SEM for years but haven’t had the opportunity to build a pure CSS-driven site yet. I’m looking forward to it, and if I need a little ammunition to convince others that it’s worthwhile, this article may help a little bit.
By the way, my grandmother at 81 is making lovely use of the web.
Cheers,
Gradiva
Can’t believe you’d call your grandmother “dimwitted.” That’s just mean.
It’s interesting that the article doesn’t mention the (X)HTML code structure itself. For example, with the magic of CSS, I can have a left menu that actually gets coded on the page after the main content, so even though my menu is entirely text-based, contains H1 tags, etc., when Google spiders my page it’ll use the beginning text, which is actually content, as the small summary in its results.
Example -> http://quotes.prolix.nu/
This site uses a tweaked version of the blue robot CSS code. Good stuff.
Hi, I’m new to CSS layout. After six years of mucking up HTML, I discovered this method really recently, after being struck by ESPN’s site. Not seeing any tables was a shock. This article gave me a lot of simple pieces of advice. I will cite it when I try to convince management to switch to CSS layout for our clients.
Let’s say I don’t want to use an h1 tag, just an image: is there a way I can still receive good search engine results? thx
Another potential benefit for SEO of CSS design is the ability to position content in the hierarchy of the HTML in order to have the most important (keyword-rich) content at the top.
Example: the chairman wants his 500-word mission statement to appear at the top of the page, with product information below it. CSS-P saves the day by allowing us to position the content that the search engines require at the top of the HTML and yet display the chairman’s waffle at the top of the rendered page.
After reading the article I thought it was time to put it to the test. I proposed to my marketing team that we implement this across Europe and see what results we get. I will provide feedback on rankings in one month’s time.
In general terms, I’d say that the article is right, but I think that any designer hoping that his/her rankings will get a boost from the code/content ratio alone is in for some nasty surprises. It really is one of the ranking factors, but it is far from one of the most important.
And, of course, it’s not necessary for a site to be in XHTML+CSS to achieve that ratio; old HTML+CSS will do fine. But I agree, it’s good that these good practices get rooted in the XHTML way of coding.
You can get a good placement without using an h1, but images have nothing to do with your placement 😉
You called your grandmother completely dimwitted — enough said!
Googlebot happens to love well marked-up websites + Googlebot loves a good code to content ratio.
XHTML / CSS offers you both.
I honestly think that the amount of code does not matter as long as the site is quick loading, and has at least some amount of content to be indexed (or sufficient link popularity to not need to worry about on the page factors.)