<lcObjective>Be able to write well-formed, measurable objectives.</lcObjective>

CIDM eNews, Information Management News 04.13

Dawn Stevens, Comtech Services, Inc.

With the introduction of the Learning and Training specialization for DITA 1.2, technical writers now have access to an entire set of elements that many have had no training in using. Just as desktop publishing introduced the “ransom note” effect to document design, the Learning and Training specialization opens the door to misunderstood and poorly used elements. Although these elements might seem exclusive to training departments moving to DITA, effective collaboration between technical documentation and training can be encouraged and enhanced if both groups share a common understanding of how these elements can and should be used. Furthermore, a basic understanding of the instructional design theory behind these elements, in particular those dealing with objectives, can also improve all topic types, whether or not they are specifically developed for training materials.

At the heart of all training content is the instructional objective. DITA 1.2 introduces the <lcObjective> element, along with other elements that allow you to group and introduce objectives and associate them with specific training needs. Objectives can be included in planning documents, learning overviews, and summaries and can be directly related to specific content and assessments. The fact that you can include the objective elements in all learning and training content implies that objectives are central to training content. The instructor’s “golden triangle”—the ultimate goal of instruction—is that need, content, and assessment all align. The objective defines the training need, content is developed to address the objective, and assessments evaluate whether the objective is met. Students would be dissatisfied with a course that promised to teach acrylic painting but had them work with watercolors in class and complete their final project in magic markers. The objective would not be in sync with the content or the evaluation. We laugh at such an extreme example, but the reality is that objective/content mismatches happen frequently in both training and documentation.
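As a concrete illustration, here is a minimal sketch of how an objective might be marked up in a DITA 1.2 <learningOverview> topic. The topic id, title, and objective wording are invented for illustration, and the content model shown (<lcObjectives> containing an <lcObjectivesStem> and an <lcObjectivesGroup> of <lcObjective> elements) follows my reading of the 1.2 learning specialization:

```xml
<learningOverview id="painting-overview">
  <title>Acrylic Painting Basics</title>
  <learningOverviewbody>
    <lcObjectives>
      <lcObjectivesStem>After completing this course, you will be able to:</lcObjectivesStem>
      <lcObjectivesGroup>
        <!-- Hypothetical objective: condition, behavior, and standard in one statement -->
        <lcObjective>Given a prepared canvas and a set of acrylic paints, mix and
          apply a secondary color with no visible streaking.</lcObjective>
      </lcObjectivesGroup>
    </lcObjectives>
  </learningOverviewbody>
</learningOverview>
```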

Although technical writers do not typically write formal objectives for their documentation, it can be argued that each topic still has an objective: a reason for being. We define a topic as standalone content that answers a single question. It is not a stretch to reword that definition as standalone content that addresses a single objective. In fact, this slight adjustment to the definition can help align documentation and training and facilitate topic reuse between them, and it can help the technical writer eliminate irrelevant content and better focus the content on the needs of the user. By defining objectives for individual topics, we know where we intend to go and can focus the content on the end result. If the content doesn’t directly support the objective, it doesn’t belong in the topic.

That’s a lot riding on the back of a single objective, which explains the emphasis that instructional design places on writing well-formed, unambiguous objectives. An objective needs to be written so that all end users looking for a specific piece of information will identify the same topic as containing that information. It needs to be specific enough that all writers who might be assigned to write about it would choose the same information to include. It needs to be clear enough that all people judging whether or not someone has met the objective would come to the same conclusion every time.

Instructional design theory teaches that a well-formed objective contains three distinct parts: a condition, a behavior, and a standard. Each of these plays an important role in defining the overall scope of the content that will support the objective.

Conditions: Under what circumstances must the behavior be performed?

In an objective, the condition answers questions such as:

  • Will the user be given any specific tools or data? For example:
    • Given the Periodic Table…
    • With only a hammer and screwdriver…
    • Referencing the user guide as needed…
  • Will the action need to be performed in a specific environment? For example:
    • In a noisy, crowded room…
    • In subzero temperatures…
    • With minimal lighting…
  • Will the user be deprived of any resources?
    • Without supervision…
    • Without reference to a dictionary…
    • From memory…

Conditions can help the technical writer answer the question, “should I include this information in the topic or reference it?” For example, if the condition says that the user will be given a table of data, the expectation is set that the user needs that information to complete the task. Instead of referencing the table from the task topic, the writer should include the information in the topic so that the user is not forced to go somewhere else to complete the task. However, if the condition says only that the user may reference the table if needed, deprives the user of the information, or is silent on the matter, we can assume that access is not essential and the information can simply be referenced. The user can “opt in” to that information, if needed.
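For example, the wording of the condition alone can signal whether supporting data belongs in the topic itself or behind a cross-reference. These two hypothetical objectives differ only in their condition:

```xml
<!-- Condition supplies the data: include the error-code table in the topic itself -->
<lcObjective>Given the table of error codes, identify the failed component
  within two minutes.</lcObjective>

<!-- Condition makes the data optional: a cross-reference to the table is enough -->
<lcObjective>Referencing the error-code table as needed, identify the failed
  component within two minutes.</lcObjective>
```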

Behaviors: What, exactly, must the user know or do?

Most energy in writing an objective is spent describing the behavior. Special attention is needed in simply choosing the verb. The key to this selection is to remember that the behavior must be both observable and measurable. A common mistake is to use verbs such as “understand” or “know.” These broad verbs are perhaps useful in a goal statement, but objectives need to be much more specific. How will the user demonstrate understanding or knowledge? What information do we need to provide to promote understanding or knowledge? If too much is left to interpretation, the content can become unfocused, either too general to help users or too cluttered with information for users to find the specific information they need.

When selecting a verb, imagine how you would evaluate that action. For example, instead of writing “understand the functions of the system,” perhaps the users should be able to “list the functions of the system” or “describe the functions” or perhaps “select the appropriate function for their task.” The technical writer now knows what to include in the topic—is it simply a list of the functions or does it include descriptions of the functions? And within those descriptions, must it provide guidelines for when each is used?

The way in which you expect the user to demonstrate mastery of the objective relates to the “domain” and level of mastery required. Instructional designers divide objectives into four domains:

  • The cognitive domain deals with mental skills or knowledge.
  • The affective domain addresses feelings and emotions.
  • The psychomotor domain includes manual or physical tasks.
  • The social domain covers interactions with others, including collaboration and management.

The verbs you choose should reflect the domain that you are trying to affect. For example, in the affective domain, you might include verbs such as accepts, believes, questions, or defends. These verbs are ways that users might express a change in feeling or emotion. In the psychomotor domain, verbs reflect more physical actions, such as assembles, welds, fixes, or measures. And in the cognitive domain, we use verbs related to intellectual abilities and skills, including computes, recognizes, predicts, and uses. In technical writing, we most often write in the cognitive domain. We aren’t often called upon to affect emotions, which is typically left to the marketing department, and psychomotor skills are difficult to teach in a written format.

Each domain is further broken down into degrees of difficulty. You can’t hang around an instructional designer very long without hearing about Bloom’s taxonomy. This taxonomy breaks the cognitive domain into six major categories that reflect differing degrees of mastery. Once again, the verbs chosen reflect the level of mastery you expect:

  • Knowledge—Recalls data or information. Typical verbs: describes, identifies, lists, matches, recognizes. Example: Recites the Pledge of Allegiance.
  • Comprehension—Understands the meaning and interpretation of instructions and problems. Typical verbs: distinguishes, interprets, summarizes, paraphrases, explains. Example: Explains in their own words how to bake a cake.
  • Application—Applies what was learned in novel situations. Typical verbs: computes, modifies, solves, uses, operates. Example: Uses statistics to predict the likelihood of an event occurring.
  • Analysis—Separates material or concepts into component parts to understand the structure. Typical verbs: analyzes, compares, differentiates, infers, contrasts. Example: Troubleshoots a piece of equipment to locate the source of the problem.
  • Synthesis—Builds a structure or pattern from diverse elements. Typical verbs: categorizes, compiles, designs, plans, revises. Example: Revises a process to increase productivity.
  • Evaluation—Makes judgments about the value of ideas or materials. Typical verbs: appraises, critiques, justifies, evaluates. Example: Selects the most effective solution.

As the degree of difficulty increases, the amount of supporting content typically increases as well. However, it is unlikely that a single topic would be expected to bring its reader from no knowledge to a higher level such as synthesis or evaluation. The path to full mastery will require multiple topics with different objectives that reflect each level of difficulty. Topics at the upper levels assume that the previous levels have been reached. Although each topic stands alone for its specific level of mastery, all topics may be related and required to bring the user to the end result required.
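A hypothetical progression for the system-functions example above might pair one objective with each topic, climbing the taxonomy one level at a time (the objective wording and topic mapping are invented for illustration):

```xml
<lcObjectivesGroup>
  <!-- Knowledge: a reference topic listing the functions can satisfy this -->
  <lcObjective>List the functions of the system.</lcObjective>
  <!-- Comprehension: a concept topic describing each function -->
  <lcObjective>Describe the purpose of each system function.</lcObjective>
  <!-- Application: a task topic with guidelines for when each function applies -->
  <lcObjective>Select the appropriate function for a given task.</lcObjective>
</lcObjectivesGroup>
```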

Standard: To what criteria must the behavior be performed?

The specified standard in an objective defines how an observer will know it was performed appropriately—how well must a user perform to prove they’ve met the objective? Typical measurements might be related to

  • Speed. For example:
    • Within 10 minutes
    • At least 30 words per minute
  • Quality or accuracy. For example:
    • To two significant figures
    • With fewer than 3 typos
    • With no errors
  • Frequency. For example:
    • Every hour
    • At least 10 times

Although the standard is more difficult to affect by topic content alone, it can again provide clues for how to approach the documentation. For example, if the standard is to perform a task “with no errors,” the technical writer should ensure that the document prominently warns users about potential errors before they occur. On the other hand, if the standard states to perform a task “with no unrecoverable errors,” the documentation should provide troubleshooting and recovery information. Measurements of speed might guide decisions about the complexity and length of a topic. For example, if a task should be completed within two minutes, and it takes 10 minutes to read the topic, there may be a disconnect.

Given <condition>, the learner will be able to <behavior> to <standard>.

When the three parts of an objective are put together into a comprehensive whole, there is little doubt about the appropriate content to include. Instructional designers don’t start writing learning content until they have clearly identified the objectives of that content. From those well-formed, measurable objectives, they are able to select the appropriate instructional strategies that will help their learners reach their goals and only then create the relevant content. Technical writers can and should follow this lead in documentation as well. By first identifying the objective of the topic, writers need only evaluate if the topic content directly supports fulfilling the objective.
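Putting the template to work, a complete objective for a documentation topic might read as follows. The wording is invented for illustration; note how the condition, behavior, and standard each scope the content:

```xml
<!-- Condition: "Given the backup configuration screen" -> show that screen in the topic
     Behavior:  "schedule a nightly backup"             -> the task steps to document
     Standard:  "with no errors"                        -> warn about pitfalls up front -->
<lcObjective>Given the backup configuration screen, the learner will be able to
  schedule a nightly backup with no errors.</lcObjective>
```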

Unfortunately, the use of <lcObjective> elements does not enable writers to directly link an objective to a standard DITA topic. <lcObjective> elements are allowed within <learningPlan>, <learningContent>, <learningOverview>, <learningSummary>, and <learningAssessment> topic types, but not <task>, <concept>, or <reference>. Nevertheless, good planning always precedes good writing, and there is no restriction that <learningPlan> be applied solely to learning content. Technical writers should take advantage of the new learning elements introduced in DITA 1.2 to improve the documentation as well as to collaborate with their training colleagues. Writing sound instructional objectives is the first step.
