CIDM

June 2005


In Their Own Words


Michael J. Miller, PhD, LearnSpire, LLC

Information architects, technical writers, and content managers often assume that category labels will be understandable to users. But do users always understand what information can be found within each category, or the distinctions between categories?

This article highlights a series of usability evaluations of a customer support web site for an organization that produces electronic design automation tools. As part of a redesign effort for the site, usability tests were conducted to determine the ease and effectiveness with which users could locate information. Although the purpose of the site redesign study was not to assess the understandability of the document type categories, it quickly became apparent that the categories were causing considerable problems for users. For example, during one test session, 23 percent of all failures were caused by participants selecting the wrong category to find information. Accordingly, a separate usability study was conducted, one focused exclusively on the document type category labels. This study would seek to determine what users expected each document type to represent and why the current types were leading to confusion.

Historical Background on Document Types

Several departments, namely tech pubs, marketing, and customer support, produced document types for the customer support web site. This diversity of input became problematic because the category names for the document types were confusing and redundant. For instance, individuals in customer support produced documents called “Solutions” that described how to solve problems; these solutions came from customers who had called in to offer them. Similarly, tech pubs produced a document labeled “Known Problems and Solutions,” which described software bugs and how to work around them. Exacerbating the issue, the site offered 18 different document types, 13 of which were produced by tech pubs. Having to discriminate among 18 different categories would likely cause difficulties for anyone. The 18 document types, and their definitions, may be seen in Table 1. Document types marked with an asterisk (*) were produced by groups other than tech pubs.

Generating this comprehensive list proved quite insightful for everyone involved in writing the documents and conducting the usability evaluations. Besides bringing potentially redundant documents to light, the team discovered that some documents existed only because of outdated practices. For example, “Release Alerts” were photocopied pages created at the end of a product release cycle, after the reference and user’s guides had been submitted to the printer. However, since all documentation was now produced online and all documents shared the same final deadline, no production-related reason remained to create “Release Alerts” as separate documents. Despite this fact, and despite the likely redundancies, the team decided to test each current document type as originally defined to determine whether keeping that information separate might be useful to customers.

Objectives of Testing

The primary objectives of testing were to determine if

  • the document type category names were understandable to users
  • the document types should be renamed or regrouped to make them more meaningful and useful to end users

Methodology: Stage One

The research team collected representative samples of all of the various document types, removed the titles and any other identifying information from them, and recruited five customers and five internal support engineers to take part in the testing. Working individually, each participant was given all of the sample document types (one sample for each type) and asked to describe what they thought each document was used for, how they would group or categorize the documents, and what label or heading would best represent each category. This technique represented a modified card sort procedure because participants were not given category names but were asked to supply names themselves. Because the sample documents all covered product-specific information, the team gathered multiple samples of several of the document types so that customers of different product lines could all take part in the tests with information related, at least in some way, to the products they used.
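Open card sorts of this kind are often analyzed by tallying how frequently participants place the same two items in one group. The following Python sketch illustrates that kind of tally with a simple co-occurrence count; the groupings and names shown are invented for illustration and do not reproduce the team’s actual data or tooling.

```python
from collections import defaultdict
from itertools import combinations

# Hypothetical open card sort data: each participant's sort is a list of
# groups, and each group is a set of document types placed together.
sorts = [
    [{"Tutorial", "User Guide"}, {"Known Problems and Solutions", "Release Notes"}],
    [{"Tutorial"}, {"User Guide", "Reference"}, {"Known Problems and Solutions", "Release Notes"}],
]

co_occurrence = defaultdict(int)
for participant_sort in sorts:
    for group in participant_sort:
        # Count every pair of items that this participant grouped together.
        for a, b in combinations(sorted(group), 2):
            co_occurrence[(a, b)] += 1

# Pairs grouped together by more participants are stronger candidates
# for sharing a single category in the final scheme.
for (a, b), count in sorted(co_occurrence.items(), key=lambda kv: -kv[1]):
    print(f"{a} + {b}: grouped together by {count} participant(s)")
```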


Table 1. Document Types and Definitions

The two most salient results from the stage one evaluation were as follows:

  • Eight of ten participants sorted by product first as opposed to task. That is, participants expected to see a list of products at the onset with each of the different document types presented under each product. “First products, then document types,” said one participant. Another stated that she would prefer to “go to a main page that lists products and maybe even products by version, and then lists document types underneath.” As one participant explained, categorizing by product reduced the number of documents in any search for information. “Once you go down one subdirectory, you should be narrowing down your search.”
  • Users do not differentiate as finely as writers do. Users tended to articulate much broader categories of information than were being presented to them. They also found it difficult to identify types that a writer of a document might think of as unique. For example, tutorials and user guides were often believed to be the same document type. While three participants described a difference between “user guide” and “reference,” three others used the terms interchangeably. “This is tricky,” said one participant.


Table 2. Suggested Categories

The team analyzed the feedback received from test participants and found that all users mentioned some form of the four categories presented in Table 2.

Methodology: Stage Two

Much was learned from stage one of testing. The team now knew that users did not distinguish between document type categories as finely as writers did and that they grouped the categories differently. However, the team felt it would be prudent to determine whether the four category names above would be readily understood and used in a similar fashion by another set of users. Accordingly, a web-based survey was developed for this stage and distributed to customers around the world who used the company’s support services. A total of 87 participants (49 in North America, 34 in Europe, and 4 in Asia) took part in the online survey.

Survey respondents completed a series of multiple-choice questions. Each question listed one document type definition and asked participants either to assign it to one of the four categories that resulted from the previous test or to write in an appropriate category. Respondents could pick only one category for any document type definition. At the end of the survey, respondents could suggest modifications or additions to the set of categories listed in the survey.


Table 3. Document Types Demonstrating High Levels of Agreement

The results of this study were quite promising: for 13 of the 18 document types, the level of agreement reached 70 percent or greater. That is, 70 percent or more of the respondents chose the same category for a given document type. Table 3 shows these document types.

Document types demonstrating lower levels of agreement are shown in Table 4. The percentage values were derived from all responses for that particular document type. Categories receiving less than 10 percent of responses are not included.
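The agreement figures reported here reduce to a simple tally: for each document type, compute the share of respondents choosing each category, flag the type as high agreement when its top category reaches 70 percent, and omit categories below 10 percent. The Python sketch below shows that calculation under those assumptions; the response data is invented for illustration and does not reproduce the study’s actual numbers.

```python
from collections import Counter

# Hypothetical responses: document type -> list of categories chosen,
# one entry per respondent (87 respondents, matching the survey size).
responses = {
    "Application Note": ["Using the Software"] * 74 + ["Other"] * 13,
    "Tutorial": ["Using the Software"] * 40 + ["Installation"] * 28 + ["Other"] * 19,
}

AGREEMENT_THRESHOLD = 0.70  # "high agreement" cutoff used in the study
REPORTING_FLOOR = 0.10      # categories below this share are not reported

for doc_type, chosen in responses.items():
    total = len(chosen)
    shares = {cat: n / total for cat, n in Counter(chosen).items()}
    top_share = max(shares.values())
    reported = {cat: f"{s:.0%}" for cat, s in shares.items() if s >= REPORTING_FLOOR}
    label = "high agreement" if top_share >= AGREEMENT_THRESHOLD else "lower agreement"
    print(f"{doc_type}: {label}; {reported}")
```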

Of the document types listed in Table 4, “Installation and Configuration” and “New or Changed Features” were most frequently selected, demonstrating that respondents viewed documents used to set up the software and documents describing changes to it as related.

These results also demonstrated the impact of learning and familiarity with specific document types. For instance, of the 19 users who placed “Tutorial” in the “Other” category, half suggested a title of “Tutorial.” Additionally, in response to the question asking whether any category was missing, 10 respondents wrote “Tutorial.” This suggests that users expected a tutorial category. Further, of the 19 respondents who placed “Known Problems and Solutions” in the “Other” category, 8 suggested calling the category “Known Problems and Solutions,” while 6 others suggested “Release Notes.” Similarly, of the additional category names offered by respondents, four of five were headings currently or previously in use by the company. As one participant stated in stage one of testing, “you get used to some naming conventions, I don’t like to see [them] change … we take a lot of time to get used to them.”


Table 4. Document Types Demonstrating Lower Levels of Agreement

Recommendations

From the results of the evaluations, the test team offered the following recommendations:

  • Use the four suggested categories to group documents.
  • List the five document types that had low levels of agreement under multiple categories.
  • Provide descriptive text for the category names to facilitate understanding.
  • Include alternate or older type names such as “Release Notes” in any descriptive text.

Lessons Learned

What strategies should writers, information architects, and content managers follow to increase the likelihood of user success in locating information? The results of this research suggest the following:

  • Know your users and determine how they prefer to browse for information. In this case, users wanted to be able to see product names first before being confronted with document types.
  • Make certain that category names are demonstrably understandable to users; do not assume they are. Additionally, provide clear descriptions or definitions of categories to ensure understanding.
  • Before deciding to separate information, evaluate whether users view the information as discrete. Will having separate documents make sense to users, or do they feel that the documents should be grouped together? Remember that users often don’t view different document types at the same level of granularity as do members of the organization producing them.
  • Strive for consistency in the formatting, use of titles, and descriptions provided for different document types. Consistency is especially important when individuals representing different groups within the organization are producing documents.

While these lessons learned may appear to be rather self-evident, consider that this study was conducted at a company that routinely conducted usability evaluations of its documentation systems yet somehow overlooked the impact that category names had on users. Investing the resources to determine user preferences, needs, and expectations will help ensure that the documents your organization produces represent a usable and effective system. Finally, it never hurts to categorize information in the users’ own words.

About the Author
