Testing Content

by Angela Colter

22 Reader Comments

  1. Thanks for that pretty good article about testing web content. One thing I missed a bit, though: the big advantage the web gave us in communication is interactivity. Don’t communicate like a book; take responses and act with interactive content once you understand your audience.
    Let the user choose her level of detail, give dynamic hints with bubbles, automatically scaling areas…
    That’s a marvelous feature of the web, but it will make testing quite a bit more complicated, I think. How can we handle this non-deterministic user experience in content (communication)?

  2. Thanks for writing such an informative article. As you outlined, testing content can be challenging. Your example of the passes on the mass transit website really hits home. I will definitely apply this to future testing!

  3. Thank you Angela. I still come across clients who don’t understand that benefits motivate consumers (even b2b) more than features do. Testing benefit content versus feature content looks like a great way to help such clients understand what works and what doesn’t. In closing, content testing should be added to any comprehensive site inventory.

  4. Thanks, Angela. With this article, I now know how to move usability from the level of objects to words.

  5. @kanzlei
    You’ve hit on what’s beautiful about usability testing: It often reveals that people do behave in unique ways that we may not have predicted when we built the interface or wrote the content. What you’re calling “non-deterministic UX.”

    With so many moving parts, I maintain that it’s possible—necessary, even—to test how well the content supports what you’re trying to do. How you do this will in part depend on what you’re testing, but even with complex interactions I’d encourage you to start by identifying what task people are trying to use your interface and content to accomplish. If they fail, is it clear why they failed and at what point in the process it happened? Getting users to think aloud and paraphrase what they’re reading can help you zero in on the issue, regardless of whether the interaction is simple or complex.

  6. @Jessica Ivins
    Glad you found the article useful!

    @Robert Moss
    I like the distinction you draw between “benefit content” and “feature content” because you could ask different questions to assess whether people get each type. For features, you might ask “what” questions (What is it? What does it do?), whereas for benefits you’d ask “so what” questions (How would this help you?). I’d expect the benefit content questions might trigger more personal, colorful responses. Great idea.

    @kalimati
    I love how you summarized this: moving usability “from the level of objects to words.” Beautiful!

  7. Angela, this is a great article raising some key points on user testing for content. I especially like the ideas for “poor man’s tests” for cases where a moderated usability test is simply not possible.

    Thanks a lot for the info.

  8. You’re right, nobody really tests content. My unofficial test for content is whether my average time on the site is going up or down.

    The other two tests you mentioned would be quite difficult to pull off on my site.

    Great post. Makes you think.

  9. I think this is a great entry on the readability of website content. Also, if you’re creating text to try to drive up your SEO and it sounds like a robot created the paragraph, then you’ve pretty much lost your reader and, with them, your credibility. The section about the physician and layperson viewing a site was particularly relevant; we all look at and read websites in much the same way… with a 15-second attention span!

  10. I too rely on the Flesch-Kincaid Grade Level formula in Microsoft Word. I can’t get over how simple and effective the Cloze test is. The best content article I’ve read in a while. (And the last best one was on A List Apart too.)

  11. Thanks for the excellent article. I’ll be using this as a resource to plan for testing my company’s content.

    I do have a question. You said:

    “Need to convince your boss to budget for content testing? Run it through a readability formula.”

    Forgive me if I’m being dense, but can you clarify how the readability formula reinforces the business case for testing? Do you mean by showing when the content is too complex for the average user to understand?

  12. Melanie, that’s exactly what I mean: showing stakeholders that the content is too complex for the average user to understand.

    The first question is, what does the average user understand? There is an oft-cited statistic that the average adult in the U.S. reads at or below an 8th-grade level. (The study that’s usually cited as the source, the National Adult Literacy Survey, never actually specifies a grade level. It states that nearly half of adults have literacy skills in the lowest two of five levels, meaning they have inadequate skills for coping with everyday tasks. But I digress.)

    If your organization has already specified a target reading level for its audience, use that.

    So the conversation might go something like this: “I’ve run some samples of this text through a readability formula, and it’s estimating that you’d need a 16th-grade education to understand it. That’s much higher than the 8th-grade reading level of the average adult. Maybe we should test this text with our users to see whether it really is a problem.”
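
    If you want to put a number on that conversation without relying on Word, the grade-level formula itself is easy to script. Here’s a minimal sketch in Python; the vowel-group syllable counter is a rough stand-in I’m assuming for illustration (real tools count syllables more carefully), but the Flesch-Kincaid Grade Level formula is the standard one:

        import re

        def count_syllables(word):
            # Rough heuristic: count runs of consecutive vowels.
            # Real syllable counters are more careful than this.
            return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

        def flesch_kincaid_grade(text):
            # Standard formula: 0.39 * (words per sentence)
            #                 + 11.8 * (syllables per word) - 15.59
            sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
            words = re.findall(r"[A-Za-z']+", text)
            syllables = sum(count_syllables(w) for w in words)
            return (0.39 * (len(words) / len(sentences))
                    + 11.8 * (syllables / len(words))
                    - 15.59)

        sample = ("Persons with two or more service interruptions may be "
                  "subject to additional verification requirements.")
        print(round(flesch_kincaid_grade(sample), 1))

    The grade-level score it prints is the number you can take into that conversation with stakeholders.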

  13. A very interesting article. Readability is something that seems to fall right into a deep hole with too many clients who want ‘sexy graphics’ and not much else. Rant over.

    My main point, though, is that tools such as Kampyle can also be very useful for getting user feedback. It certainly helped us – though we don’t keep it on the site all the time.

  14. I will be trying out Cloze testing on some current projects. Thanks for the tip on this. It’s simple, easy to set up and clear to score. Love it!

  15. Thanks for this post. As an SEO guy and editor, I feel like I need to start doing this stuff IMMEDIATELY.

    Good insight. Thanks.

  16. Thanks, Angela, your responses to comments are almost as helpful as the article. I had just finished reading the comment “benefits motivate consumers (even b2b) more than features do” and thinking, gosh, that sounds really important, what does that mean?

    Then you said, “I like the distinction you draw between ‘benefit content’ and ‘feature content’ because you could ask different questions to assess whether people get each type. For features, you might ask ‘what’ questions (What is it? What does it do?), whereas for benefits you’d ask ‘so what’ questions (How would this help you?). I’d expect the benefit content questions might trigger more personal, colorful responses.” Ooooh!

    I often work with health websites for mainstream audiences that are written much too formally, for college-level readers, when the audience is largely low-literacy adults. Thank you for your language, and your concrete tests, for supervisors. REALLY helpful.

    Your writing is so understandable and actionable (and kind). I went to your blog to sign up and saw you have a post about low-literacy audiences. I can’t wait to read it. Thanks!

  17. Jakob Nielsen’s Alertbox has a nice, short overview of the Cloze test. Check it out at http://www.useit.com/alertbox/cloze-test.html

    This is a sidebar to a larger article about Mobile Content Comprehension at http://www.useit.com/alertbox/mobile-content-comprehension.html

    I’ve always maintained that the Cloze test can be done independently of how the text appears in an interface, but Nielsen’s article suggests that the interface can have a profound effect on comprehension, as indicated by the difference between Cloze test results using a desktop format and those using a mobile interface format.
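
    For anyone who wants to try the mechanics, here’s a minimal sketch of the classic every-fifth-word procedure in Python. Exact-match scoring and a 60% pass threshold are the conventions most often cited for Cloze testing; the sample passage and the dictionary-of-responses interface are just assumptions for illustration:

        def make_cloze(text, n=5):
            # Classic Cloze procedure: blank out every nth word
            # and remember the answers by word position.
            words = text.split()
            answers = {}
            for i in range(n - 1, len(words), n):
                answers[i] = words[i].strip(".,;:!?").lower()
                words[i] = "_____"
            return " ".join(words), answers

        def score_cloze(answers, responses):
            # Exact-match scoring; 60% or better is the commonly
            # cited threshold for adequate comprehension.
            correct = sum(1 for i, expected in answers.items()
                          if responses.get(i, "").strip().lower() == expected)
            return correct / len(answers)

        passage = ("Riders may purchase a monthly pass at any station "
                   "kiosk or online before the first day of the month.")
        cloze_text, answers = make_cloze(passage)
        print(cloze_text)  # blanks appear where every fifth word was removed

    Presenting the same blanked passage in a desktop layout versus a mobile layout, as Nielsen did, then becomes a matter of where you show the text, not how you score it.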

  18. Keep posting such valuable discussions. I have really learned from material like this and hope to see more.

  19. I will certainly discuss this matter with my friends to reach a logical conclusion. I would like to hear further details so we can discuss the matter in depth.
    http://www.kapellohair.co.uk/hair-extensions-training-courses.php

  20. Hair Extensions
    http://www.kapellohair.co.uk/
    Thanks for writing such an informative article. That’s a marvelous feature of the web, but it will make testing quite a bit more complicated.

  21. Thanks for some great ideas and suggestions, especially some testing tools and protocols I hadn’t heard of!  Three add-on comments/suggestions:

    1) Use an online survey to identify major content comprehension problem areas you can then focus on (prioritize) during testing. As Content Strategist for a Fortune 500 telecom company, I ran a survey that first segmented users by primary task purpose (Shop? Order? Learn?) and product interest (Internet? Phone? TV? etc.), then asked (among about a dozen other questions) “What information did you find confusing or hard to understand?”… followed by multiple-choice answers. The results were a huge help in designing a subsequent usability test that included comprehension of selected pages.

    2) During testing, either have an IA or UX designer join you as an observer, or take notes yourself on the interaction issues that inevitably come up during content testing – users don’t differentiate between the two disciplines (obviously); for them it’s all the same experience. So even if asked to read something for comprehension, they’ll see link labels they don’t understand, or comment on colors that are hard to see, etc. – nice tidbits for IA/UX enhancements.

    3) Maybe this goes without saying, but when you’re testing comprehension of existing content, you’ll probably also identify missing content – info your test subjects say they need but can’t find. Specifically ask subjects about that if they don’t bring it up themselves – quick ’n’ dirty “content gap analysis”!

  22. Recently you mentioned this article in a tweet. I’m pretty sure I read it when it was new, but it was worth the time to read it again.

    I especially appreciate the advice for when to use what kind of testing. Working in a setting in which my audience varies among ditch diggers, petrochemical engineers, legislators, “the general public,” and others, it’s hard to know when well-structured copy is readable enough.

    As you’ve noted, usability testing is one way to assess that, but sometimes it’s hard to convince observers that the problem is the copy, not the lack of attractive images. The Cloze test, which I set aside as a novelty when I learned of it years ago, seems well suited to just that situation.

    Next time I run into that, I’ll be sure to give it a try. But I won’t wait for those circumstances to arise before trying it, either.
