Testing Content

by Angela Colter

22 Reader Comments

  1. Thanks for the excellent article. I’ll be using this as a resource to plan for testing my company’s content.

    I do have a question. You said:

    “Need to convince your boss to budget for content testing? Run it through a readability formula.”

    Forgive me if I’m being dense, but can you clarify how the readability formula reinforces the business case for testing? Do you mean by showing when the content is too complex for the average user to understand?

  2. Melanie, that’s exactly what I mean: showing stakeholders that the content is too complex for the average user to understand.

    The first question is, what does the average user understand? There is an oft-cited statistic that the average adult in the U.S. reads at or below an 8th-grade level. (The study that’s usually cited as the source, the National Adult Literacy Survey, never actually specifies a grade level. It states that nearly half of adults have literacy skills in the lowest two (out of 5) levels, meaning they have inadequate skills for coping with everyday tasks. But I digress.)

    If your organization has already specified a target reading level for its audience, use that.

    So the conversation might go something like this: “I’ve run some samples of this text through a readability formula, and it’s estimating that you’d need a 16th-grade education to understand it. That’s much higher than the 8th-grade reading level of the average adult. Maybe we should test this text with our users to see whether it really is a problem.”
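    To make that conversation concrete, here is a minimal sketch of the kind of readability estimate a formula produces. It implements the Flesch-Kincaid grade-level formula with a crude vowel-group syllable counter; the function names and the syllable heuristic are my own illustration, not a tool the article prescribes, and real tools (or a dictionary-based syllable count) will give somewhat different numbers.

```python
import re

def count_syllables(word):
    # Crude heuristic: count groups of consecutive vowels.
    # Not dictionary-accurate, but close enough for a rough estimate.
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def fk_grade(text):
    # Flesch-Kincaid grade level:
    # 0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * (len(words) / len(sentences))
            + 11.8 * (syllables / len(words))
            - 15.59)
```

Running a few representative paragraphs of your content through something like this gives you the "you'd need a 16th-grade education" number for the stakeholder conversation above.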

  3. A very interesting article. Readability is something that seems to fall right into a deep hole with too many clients who want ‘sexy graphics’ and not much else. Rant over.

    My main point though is that using tools such as Kampyle can be very useful too for getting user feedback. It certainly helped us – though we don’t keep it on site all the time.

  4. I will be trying out Cloze testing on some current projects. Thanks for the tip on this. It’s simple, easy to set up and clear to score. Love it!

  5. Thanks for this post. As an SEO guy and editor, I feel like I need to start doing this stuff IMMEDIATELY.

    Good insight. Thanks.

  6. Thanks, Angela, your responses to comments are almost as helpful as the article. I had just finished reading the comment “benefits motivate consumers (even b2b) more than features do” and thinking, gosh, that sounds really important, what does that mean?

    Then you said “I like the distinction you draw between “benefit content” and “feature content” because you could ask different questions to assess whether people get each type. For features, you might ask “what” questions (What is it? What does it do?) where for benefits you’d ask “so what” questions (How would this help you?) I’d expect the benefit content questions might trigger more personal, colorful responses.” Ooooh!

    I often work with health websites for mainstream audiences that are written much too formally and to college readers, when the audience is largely low-literacy adults. Thank you for your language, and your concrete tests, for supervisors. REALLY helpful.

    Your writing is so understandable and actionable (and kind). I went to your blog to sign up and saw you have a post about low-literacy audiences. I can’t wait to read it, thanks!

  7. Jakob Nielsen’s Alertbox has a nice, short overview of the Cloze test. Check it out at http://www.useit.com/alertbox/cloze-test.html

    This is a sidebar to a larger article about Mobile Content Comprehension at http://www.useit.com/alertbox/mobile-content-comprehension.html

    I’ve always maintained that the Cloze test can be done independent of how the text appears in an interface, but Nielsen’s article suggests that the interface can have a profound effect on comprehension as indicated by the difference between Cloze test results using a desktop format and those using a mobile interface format.
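    For anyone who wants to try the Cloze test described above, here is a minimal sketch of the standard procedure: blank out every fifth word, have participants fill in the blanks, and score exact matches. The function names and the exact-match scoring rule are my own illustration (some practitioners also accept synonyms); Nielsen's article discusses what score thresholds suggest adequate comprehension.

```python
def make_cloze(text, n=5):
    # Replace every nth word with a blank; the classic Cloze test uses n=5.
    # Returns the blanked passage and a dict mapping word index -> original word.
    words = text.split()
    blanks = {}
    for i in range(n - 1, len(words), n):
        blanks[i] = words[i]
        words[i] = "_____"
    return " ".join(words), blanks

def score_cloze(blanks, answers):
    # Exact-match scoring: fraction of blanks filled with the original word.
    correct = sum(1 for i, w in blanks.items()
                  if answers.get(i, "").strip().lower() == w.strip().lower())
    return correct / len(blanks) if blanks else 0.0
```

Because the test operates on the word sequence alone, the same passage can be administered in a desktop layout and a mobile layout to compare comprehension across interfaces, which is exactly the difference Nielsen's results highlight.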

  8. Keep posting such valuable discussions. I’ve learned a lot about these matters and would like to see more.

  9. I will certainly discuss this with my friends to reach a logical conclusion. I’d like to hear further details so we can discuss the matter in depth.
    http://www.kapellohair.co.uk/hair-extensions-training-courses.php

  10. Hair Extensions
    http://www.kapellohair.co.uk/
    Thanks for writing such an informative article. That’s a marvelous feature of the web, but it will make testing quite a bit more complicated.

  11. Thanks for some great ideas and suggestions, especially some testing tools and protocols I hadn’t heard of!  Three add-on comments/suggestions:

    1) Use an online survey to identify major content comprehension problem-areas you can then focus on (prioritize) during testing. As Content Strategist for a Fortune 500 telecomm company, I ran a survey that first segmented users by primary task purpose (Shop? Order? Learn?) and product interest (Internet? Phone? TV? etc.), then asked (among about a dozen other questions) “What information did you find confusing or hard to understand?”… followed by multiple choice answers. Results were a huge help designing a subsequent usability test that included comprehension of selected pages.

    2) During testing, either have an IA or UX designer join you as an observer or take notes on interaction issues that inevitably come up during content testing – users don’t differentiate between the two disciplines (obviously) – for them it’s all the same experience. So even if asked to read something for comprehension, they’ll see link labels they don’t understand, or comment on colors that are hard to see, etc. – nice tidbits for IA/UX enhancements.

    3) Maybe it goes without saying, but when you’re testing comprehension of existing content, you’ll probably also identify missing content – info your test subjects say they need but can’t find. Specifically ask subjects about that if they don’t bring it up themselves – a quick ‘n dirty “content gap analysis”!

  12. Recently you mentioned this article in a tweet. I’m pretty sure I read it when it was new, but it was worth the time to read it again.

    I especially appreciate the advice for when to use what kind of testing. Working in a setting in which my audience varies among ditch diggers, petrochemical engineers, legislators, “the general public,” and others, it’s hard to know when well-structured copy is readable enough.

    As you’ve noted, usability testing is one way to assess that, but sometimes it’s hard to convince observers that the problem is the copy, not the lack of attractive images. The Cloze test, which I set aside as a novelty when I learned of it years ago, seems well suited to just that situation.

    Next time I run into that, I’ll be sure to give it a try. But I’ll also not wait for those circumstances to arise before I do.
