Dawn Stevens, Comtech Services, Inc.
The question of separating or combining information development and instructional design departments often results in strong opinions and heated debates at all levels in the departments and in the organization as a whole. Even when the majority acknowledges the benefits of working together, decisions concerning organizational structure are often outside their control. Circumstances may make it hard, if not impossible, to collaborate in an environment of different processes, priorities, and deadlines. In such situations, even the most passionate believers in the benefits of working together ultimately return to their separate corners and hope for a more enlightened future.
Collaboration between information development and instructional design benefits from shared, reusable content that results in shorter development cycles. However, end users realize even greater benefits when documentation is developed as a learning product. When information developers recognize that information plays a role in teaching customers about the product, rather than just telling customers about the product, the information is more focused on user needs and expectations than on the product itself. Users are better able to find the information they need, apply it to the situations in which they find themselves, and internalize it so they don’t have to interrupt their workflow to refer to the documentation.
Even when circumstances keep information developers from partnering with their sister instructional designers, nothing stops writers from applying sound instructional design principles to their information-development process. In fact, taking such steps may be the catalyst that leads to a more collaborative relationship between information development and instructional design.
Although instructional design is a distinct field, often requiring a university degree, even a few basic principles can improve the usability and effectiveness of technical information.
Start with User and Task Analysis
When designing training content, instructional designers start with a front-end analysis to identify what they need to teach and to whom they need to teach it. The focus of this analysis is to understand
- the desired end state—What does the audience need to know or do when the training is complete?
- the current state—What does the audience already know and what can they already do?
- the gap—What is the difference between the current state and the end state?
Further, careful user analysis not only reveals what the audience knows and is able to do, but also provides information about how that audience approaches learning. Instructional designers ensure that they take into account the personal, social, and cultural characteristics of their learners so that they select strategies that communicate most effectively.
Contrast this process with the typical front-end analysis for documentation; it often starts and ends with what the product can do. Information developers identify new and changed features and functions of the product and select strategies for best presenting those features and functions. They often do not formally consider what users want to accomplish with the product or what knowledge and skills they bring to the table. Information developers often assume that the desired end state is a thorough understanding of the entire product and that users are either complete novices or seasoned experts. In reality, neither assumption holds. Most users don’t want to know everything, just what they need to do their jobs. Depending on the product, many bring some relevant knowledge and experience to the task.
Consider, for example, a microwave oven. The oven’s user manual guides users button-by-button through the control panel. Reading through the entire manual gives the new owners a thorough review of everything they can do with their oven. However, most owners of microwave ovens never use all these features and yet remain perfectly happy with their purchase. Most just want to know how to defrost frozen food, reheat leftovers, and pop popcorn. In addition, the vast majority of new owners bring experience with a large variety of other microwave ovens and are likely able to figure out the basic functions without even cracking open the guide. What is the point, then, of creating a half-inch-thick manual that is rarely, or never, opened? Who is it for, and is it really meeting their needs?
Information developers often focus on thoroughness and consistency, dutifully describing every screen and every field, no matter how obvious. When a trainer teaching a task encounters a screen that must be completed to continue the task, he or she might say, “Most of these fields are self-explanatory. Let’s just take a closer look at the couple that need special attention.” In user documentation, screen and field descriptions lack the context of a meaningful task. Users are told how to complete every field, which they can often figure out on their own, but they’re not told how doing so will help them achieve their goals, which they often cannot fathom.
By taking the time to perform a user and task analysis, information developers learn what users need to do after reading the documentation and what they already know. They discover what is difficult for users and what is second nature. They understand the background and attitudes users bring to the documentation. With this information, information developers can avoid writing what no one reads and, more importantly, avoid asking users to hunt for the information they need buried among information they don’t need.
Write Learning Objectives
With user and task analysis in hand, instructional designers write specific learning objectives that their training materials must address. These objectives translate the user and task analysis into measurable objectives that define the success, or failure, of a training program. Learning objectives keep the instructional designers focused on the purpose of their training. Not only are learners evaluated against those objectives during the training, but, as material is created, it is evaluated to ensure it supports the objectives. If content does not directly relate to an objective, it is considered unnecessary to the program and removed.
Learning objectives can perform a similar function for information development. A core principle of minimalism is to provide only content that users actually need. But writers often struggle to determine what those needs are. With learning objectives identified, existing content can be evaluated for relevance and new content written with relevance in mind.
The key, however, is writing measurable objectives. For example, if asked for a list of objectives that users should be able to meet after reading the documentation, writers might suggest, “Understand the feature” or “Use the feature.” These general statements are not well-formed, measurable objectives, and they are therefore useless for evaluating content. What specifically constitutes “understanding the feature”? How would users demonstrate that understanding? What specific facts do they need to know? In what context is the audience “using the feature”? What end result do they want to achieve when they are done?
For example, if a user must understand the difference between AC and DC current before installing a product, the objective is not “Understand electrical theory” or even “Understand alternating and direct current.” Both are still too general: there’s no indication of what the user needs to understand. A more specific objective such as “Differentiate between AC and DC currents” or “Compare and contrast AC and DC currents” clearly defines what content needs to be presented. Information about the behaviors of the two types of current is appropriate.
In structured writing for topic-based documentation, the learning objective not only helps focus content, but also can help define the appropriate topic type and structure for the content. Objective verbs such as “define,” “differentiate,” “list,” and “describe” all point to conceptual information, while “do,” “complete,” and “perform” suggest task information. When combining objectives into single topics, similar verbs can help maintain the integrity of the information type. Task information is not buried in the midst of conceptual information, for example.
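The verb-to-topic-type mapping described above can be sketched mechanically. The following is a minimal illustration, not a production classifier; the verb lists and the function name are assumptions chosen for the example, extending the verbs named in the text.

```python
# Illustrative sketch: map the leading verb of a learning objective to a
# likely topic type, using the verb groupings from the text. The verb sets
# below are assumptions, not an authoritative taxonomy.

CONCEPT_VERBS = {"define", "differentiate", "list", "describe", "compare"}
TASK_VERBS = {"do", "complete", "perform", "install", "configure"}

def suggest_topic_type(objective: str) -> str:
    """Return a suggested topic type for a measurable learning objective."""
    verb = objective.strip().lower().split()[0]
    if verb in CONCEPT_VERBS:
        return "concept"
    if verb in TASK_VERBS:
        return "task"
    # An unrecognized verb may signal an objective that is not yet measurable.
    return "unclassified"

print(suggest_topic_type("Differentiate between AC and DC currents"))  # concept
print(suggest_topic_type("Perform the initial power-up test"))         # task
```

A real implementation would need richer verb lists and handling for compound objectives, but even this simple check makes the point: a well-formed objective announces its information type in its first word.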
Finally, learning objectives can also make excellent short descriptions for the topic, stating clearly and succinctly what the reader can expect to find in the topic.
Show, Don’t Tell
A hallmark of good training is providing excellent, relevant examples. In training it is not enough simply to explain how to accomplish a task; each explanation is supplemented with demonstrations and hands-on experience. A good trainer, even when delivering a standard course, identifies the specific situations in which the students will apply the principles of the course and changes the examples accordingly each time the course is taught. Learners aren’t just told something will work for them; they see and experience it working during the training.
Unfortunately, meaningful examples are scarce in much documentation. Claiming that they can’t anticipate every way a customer might want to use the product, information developers produce general content and expect users to make the leap to apply it to their specific situations. Even when the specific applications are known, the lack of examples is justified by a lack of development time or the increased expense of customizing the examples.
In the past, these justifications might have held more water. The cost of creating and printing unique solutions documents, geared toward specific industry applications of a product, was prohibitive. However, the ease with which content can now be reused across multiple publications and the fact that most documents are delivered electronically and rarely printed make it feasible to provide a wealth of examples to help users apply a product to their specific situations. Further, these examples need not be limited to words on a page, but can take advantage of any number of media options, such as animations and videos. Users need not attend a training course to see a demonstration or try a set of steps in a safe, isolated environment before going live. Documentation can provide these experiences as well, using the time saved by providing only the information the users need in the first place.
Once again, documentation’s need to be consistent hinders the development and inclusion of solution-specific examples. If the product is widely used in a dozen industries, the assumption is that solutions documents for all twelve must be released simultaneously. If there are fifty tasks that can be accomplished with the product, consistency dictates that each and every one of them must have an example, and, if one example uses video, they all must. Here again, take a cue from instructional design: all tasks are not created equal. Based on the results of their initial user and task analysis, instructional designers make intelligent choices about the depth and level of examples required for each task and even whether to include those tasks at all in the training. Yet training is not criticized as inconsistent because of these different approaches. Consistency should not be defined by the type of content provided, but by whether the needs of the user were met in each case.
Evaluate and Redesign
The instructional design process includes an evaluation phase that feeds the next development cycle. Instructional designers look at the results achieved within their courses: Did students pass a final test or certification? Did supervisors report that, after training, their employees were better able to complete their jobs? Did students retain the information over time? They ask instructors what types of questions were asked during the training but weren’t covered in the material. They review test results for commonly missed questions that might point to weaknesses in the content. During the next course update, these data inform what changes should be made to the content.
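The "commonly missed questions" review described above is easy to picture in miniature. The sketch below is hypothetical: the data shape (one dict of question results per student) and the 50% threshold are assumptions made for illustration, not a prescribed method.

```python
# Hypothetical sketch: given per-student test results, surface the questions
# missed most often -- the kind of signal that feeds the next content revision.
from collections import Counter

def commonly_missed(results, threshold=0.5):
    """results: list of dicts mapping question id -> True (correct) / False (missed).
    Returns question ids missed by at least `threshold` of students."""
    misses = Counter()
    for student in results:
        for question, correct in student.items():
            if not correct:
                misses[question] += 1
    n = len(results)
    return sorted(q for q, count in misses.items() if count / n >= threshold)

results = [
    {"q1": True,  "q2": False, "q3": False},
    {"q1": True,  "q2": False, "q3": True},
    {"q1": False, "q2": False, "q3": True},
]
print(commonly_missed(results))  # ['q2'] -- the content behind q2 may need rework
```

The output is a shortlist, not a verdict: a question everyone misses may indicate weak content, a badly worded question, or a genuinely hard concept, and the follow-up is editorial judgment, not automation.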
Far too often, information developers have no idea how their documentation has been received. Updates to the documentation cover new or changed features, not opportunities to improve the overall approach. Right or wrong, the style and choices made in version one are maintained for the entire life cycle of the product. Yet the needs, expectations, and experience of users change between releases and over time. The documentation development process needs to recognize these changes and adjust from release to release. How long did it take some companies to finally remove instructions for using a mouse from their documentation? In some cases, those instructions remained until the product was taken off the market, despite the fact that users had found them useless years before. Rather than letting documentation simply grow longer as more of the same content is added, information developers need to adjust for changes in the audience; they need to take into account areas that were not well received in previous releases; and they need to revamp for changes in technology and accepted best practices.