Susan Harkus

In a recent program on Australian National radio, an environmental scientist, Tony Recsel, criticised two advocates of high-density housing who blamed the community for the resulting increase in traffic congestion and pollution.

Recsel cited their insistence that “the resulting traffic congestion and pollution are just too bad; people should simply not drive when there is congestion!” (Planning Our Suburbs—Sound Policies Or Fads, Sunday 1 Feb 2004.)

His response to this attitude was emphatic. “Planners surely should be more pragmatic. What’s the use of dogmatically basing your planning on what people ought to do, when you know very well that they won’t do it?”

This is a conundrum that faces information developers: the chasm between how we think people should use information and how they do—the “Why DON’T they know? It’s in the user guide” syndrome.

“Learners… don’t seem to appreciate overviews, reviews, and previews; they want to do their work. They come to the learning task with a personal agenda of goals and concerns that can structure their use of training materials.” (John M. Carroll, 1990. The Nurnberg Funnel. Massachusetts: MIT Press, p. 26.)

Candice Harp’s research, published in 1996, is another reality check—and no less intimidating. Her study involved 263 participants who were interviewed about their strategies for learning “all types and brands of office automation software.”

The most useful learning strategy was “experimenting.” Reference manuals came in 9th, third-party books 13th, and online Help 14th. (“Winging It,” Candice Harp, Computerworld 30 (43), pp. 107–109, October 21, 1996.)

I am sceptical about survey results, but most of us will see OURSELVES in this survey, and in Recsel’s and Carroll’s comments. So often, WE don’t do as we’re supposed to either!

How, then, do we respond, we who are up to our armpits in the business of writing?

In a recent project, our team faced this very challenge. My client was commercialising a Web-based application that enabled business teams to create and publish important corporate documents on the Web and in hardcopy.

We knew that our users would be focused on their business tasks, not the software. We looked “pragmatically” at the real-world contexts of users such as the senior executive who signed off on publications or the marketing manager co-ordinating document development as an additional, “administrative” task.

Our documentation solution was not selected on the basis of the information that each type of user needed, though that was certainly important. We needed to meet our users in their performance context and to design our information to match that context.

We considered a number of user contexts and set a simple objective for the user guide—get users to their starting point in the application. After that, users could learn what they needed by exploring, supported, if they wished, by targeted online information.

We created a simple checklist for each user type. Checklists had two columns—Task and How? Some checklists had several rows (tasks), but at least one checklist had only one task. All checklists fitted on less than a page and included lots of white space, which we hoped would suggest low cognitive effort.

The How? column provided only essential, get-started guidance. For example, the coordinator’s checklist had the task “Set up a senior executive for the final review” with the How? entry “Print the reviewer’s checklist and pass it to the reviewer.”

We included a colored, boxed section above each checklist. The boxed section explained the two types of Help: “form Help” accessed from a Help icon and mini-Help text beside specific fields. All Help delivered make-sense-in-the-business-context information.

Yes, we recognised that users who were focused on “doing” would be likely to skip the “helpful” information at the top of the page, just as they skip instructions at the top of online and hardcopy forms… so we planned a tipping strategy.

We included an unsettling heading for the top-of-page section—Will you know what to do? We made the two headings within the section the first layer of information—Watch for the Help icon

[picture] and Watch for mini-Help.

The messages in those three headings might not be read explicitly, but we expected they would be absorbed in some way.

We hoped that if users hit a roadblock in the application, prior subliminal processing of those messages would encourage them either to read their checklist to learn how to use the online resources or simply to look for those resources.

The application and its minimal documentation are in the final phase of production. We want our strategy to support at least 80 to 90 percent of users, and we will soon know if it has failed: telephone support levels will soar.

Our “minimal” manual can be seen as a one-off solution for a particular user performance problem, but the process can be applied more widely.

  1. We looked at our users in the context of their business tasks and their agenda: more work, time pressure, a deliverable with high corporate visibility, dependence on others’ performance, and even corporate politics.
  2. We designed the solution to deliver appropriate information to the user contexts. “Appropriate” meant being minimal but also meeting each user’s priority.
  3. We looked to our usability work on the application to support users as they performed their real-world business activities.
  4. We wrote Help for key fields only, and the Help information mapped field input meaningfully to business decisions.
  5. We wrote Help for some forms—those where user “business” decisions were complex or challenging.
  6. The team’s creative designer developed a Help icon and a mini-Help presentation that unsettled the eye, increasing their visibility.

What we did follows the path of the many developers who have applied the principles of minimalism.

For example, in the early 90s, JoAnn Hackos and her team developed an introductory guide for expert users of statistical analysis software. The approach anticipated the way expert users would want to work and supported experimentation and problem avoidance. (Choosing a Minimalist Approach for Expert Users, JoAnn T. Hackos. Minimalism Beyond the Nurnberg Funnel. Massachusetts: MIT Press, pp. 149–177.)

A minimalist approach takes up the challenge that Michel Foucault threw down to authors in “What Is an Author?” It was not that Foucault thought authors irrelevant. Rather, he recognised that the reader’s context (Foucault’s discursive environment) will always have a “major impact on how [readers] interpret the author’s work.” (Foucault: Approaches to Understanding the Text in Context, Michael Olsson. keyword, Nov. 1997.)

When we design information, we need to leave knowledge-creation space for our information users. Our task is to determine how much space.

The user’s real-world context and agenda are probably good starting points.