Leading and Trailing Measurements: Applying Metrics to your Content


JoAnn Hackos, Comtech Services, Inc.

Last week, January 26 and 27, 2016, marked the second online conference offered by CIDM. One hundred members and non-members signed up, with many in regular attendance at the sessions and others intending to listen to the recordings. If you missed it, you missed a really exciting and worthwhile event.

On the first day, Dawn Stevens, Comtech Services, presented recommendations for using a Balanced Scorecard to assess the immediate and long-term potential of an information-development team. Mike Eleder of Nokia Networks explained exactly how to measure content improvement using Six Sigma methods. Rashmi Ramaswamy and Liane Ghosh, Innovatia, told us how to use innovative measures to improve content design and delivery. Teresa Acob, Ericsson, explained how her team fixed a very sick project.

On day two, Abi Bettle, IBM, recounted techniques to measure collaboration. Michele Marques and Paresh Naik, BMC Software, gave a very thorough and detailed analysis of Google Analytics. Scott Carothers, Kinetic theTechnologyAgency, showed us how to measure the translation process. Amber Swope explained how to measure the consistency of content across deliverables. IXIASOFT’s Keith Schengili-Roberts showed us how he measures ROI on DITA projects, and Fiona Hanington, Ericsson, described the research conducted to learn whether customers find it easier to locate the content they need when it is delivered in topics rather than books.

I began the two-day session with an introduction to quality measurements. You can read more in my article, “Establishing Quality and Usability Benchmarks for Information Products,” which served as a basis for my talk. You can also view my 2005 article on Quality Management in the CIDM article archives, along with seven others in an eight-part series on the Information Process Maturity Model (IPMM), including an article on Quality Assurance in general.

It might also be interesting to know that the ISO standards committee that I participate in has been updating the Review and Testing standard for technical documentation. The new version of ISO 26513, Systems and software engineering — Requirements for testers and reviewers of user documentation, should become available in late 2016. Of course, as information-development professionals have long recognized, review and testing are essential hallmarks of quality.

During my January presentation, I mentioned a concept that information developers should carefully consider when measuring quality or any other characteristic of their process and product. That concept is the distinction between leading and trailing indicators of quality.

Of course, leading and trailing indicators are not limited to quality assessments. In the world of economics, a leading indicator is used to try to predict whether the economy might be growing or heading toward a recession. There is a theory, for example, that when lipstick purchases go up, the stock market is heading for a decline.

A trailing or lagging indicator tells us about something that has already happened. For example, there is a theory that aspirin sales increase when the stock market has already gone into a decline.

In information development, we need to pay attention to and carefully measure both leading and trailing indicators of the quality of our information. We tend to favor leading indicators, or indicators we hope are leading, because they are the easiest to administer. We ensure that topics follow our DITA Information Model or our style guide. We might have editorial checks that cover writing and layout. We might ensure that procedures are tested against the product so that they work correctly. We might even conduct usability tests of sample content to learn if users can successfully perform procedures or understand conceptual information.

Yet, most leading indicators cannot tell us if we have been successful in meeting customer information needs. For trailing indicators, we have traditionally asked customers what they thought. We might include feedback comments with our topics, allowing customers to register their satisfaction with a topic and add a comment. We might conduct a customer survey about the content, asking customers if they found the content helpful and usable.

Until we began posting content on the web, we had few opportunities to collect feedback from customers. Now we can analyze data about customer behavior, discovering, for example, how they search for information or if they spend time reading or downloading information they have found.

Online communication channels also provide us with more ways to interact directly with customers, to observe their behaviors and their thoughts as they try to find and use information online. We can learn, for example, that our procedures are too long and wordy, interfering with successful completion of a task. Or, we might learn that we are using vocabulary that our customers don’t understand or terminology that hinders customers’ ability to find the information they need.

Whenever we learn that quality has been compromised, we are able, given online content delivery, to fix problems as soon as we identify them. Unfortunately, in our studies of information developers, we find too often that problems rarely get fixed in a timely manner. Many organizations simply wait for the next release, letting problems linger and continue to annoy customers.

I recommend that you carefully evaluate the activities you have in place in your information-development process to assess quality. Be certain that you have both leading and trailing indicators in place and that you have an action plan to ensure that quality defects are corrected. Avoid the temptation to let the pressures of the “next” release allow known problems to continue or new problems to go unidentified. Quality, remember, is essential.

The recordings of the online conference, Measure Anything, will be available by February 22, 2016. You must register and pay the fee of $375 for the recordings if you are not a CIDM departmental member.
