From the Director
Measuring the Intangibles: A measurement extravaganza at Best Practices 2011
In his keynote address at the 2011 Best Practices conference, Douglas Hubbard, author of How to Measure Anything, reminded us that we must be careful about measurements. Most of us tend to spend time measuring what we already know rather than measuring what we don’t know. We must, he asserted, measure what matters most to our decision-making and the decision-making of our management.
We use measurements to make better decisions by reducing our uncertainty. We want to make better “bets” on the future. To do so, we need to focus on the measurements that have the highest payoff and give us the most useful information.
Hubbard, who spent considerable time talking with the Best Practices participants about their work, noted that the measurements in our information-development world are not so different from measurements in other business areas. Like everyone else, we can focus on gathering information that helps us become less uncertain.
He shared his favorite acronym, COM:
- C for Concept
- O for Object
- M for Method
Concept means that we use measurements to reduce our uncertainty in making a decision. Object means that we want to define clearly what we are trying to measure. For example, we may want to decide if we have enough resources to develop information for a new product in advance of the product release. In short, can we make the deadline and produce information of the required quality to meet customer needs? Or, we may want to know if moving to the DITA standard will increase the productivity of our staff so that we can do more with the existing resources.
Method points to the measurement techniques that help us reduce the uncertainty behind our decisions. Hubbard referenced many of the methods he discusses in his book, such as the Rule of Five: there is a 93.75% chance that the median of any population falls between the smallest and largest values in a random sample of just five observations. We have long used the Rule of Five in customer studies. If we study five customers, we can be reasonably assured that what we learn is representative of the customer population as a whole.
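The intuition behind the Rule of Five can be checked with a short simulation. The sketch below (the population values are made up for illustration) estimates how often a population's median falls between the smallest and largest of five random observations; the theoretical answer is 1 − 2 × (1/2)⁵ = 93.75%.

```python
import random
import statistics

def rule_of_five_hit_rate(population, trials=20000, sample_size=5, seed=42):
    """Estimate how often the population median falls between the
    smallest and largest values in a random sample of five."""
    rng = random.Random(seed)
    median = statistics.median(population)
    hits = 0
    for _ in range(trials):
        sample = rng.sample(population, sample_size)
        if min(sample) <= median <= max(sample):
            hits += 1
    return hits / trials

# Any population works; here, an invented spread of 1,000 task durations.
population = list(range(1, 1001))
print(round(rule_of_five_hit_rate(population), 3))  # close to 0.9375
```

The logic behind the number: each observation has a 1/2 chance of landing on either side of the median, so the sample range misses the median only when all five observations fall below it or all five fall above it.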
Following Hubbard’s keynote, many speakers described how they are using measurements to improve the quality of their information-development efforts. Bob Lee of Symantec described how he measures the value that customers find in the content on Symantec’s website. He noted that just 10% of the website content receives 80% of the customer hits. As a result, Symantec identified redundant sources of information and consolidated them into a single source. The team also found that connecting the content to their Support Forum encouraged customers to look at the content and reduced calls to support.
Lee reported that their new How To … articles have attracted considerable customer attention. One new article ranked third in total hits, and three of the new articles were among the top six. Hits on the articles targeted for search engine optimization rose by 344%, compared with a rise of only 160% for non-targeted articles. Calls to support for basic How To … information decreased significantly.
The reuse reported by Daphne Walmer of Medtronic, a medical device company, is nothing short of spectacular. Her organization produces 1,035 different manuals, with 29 million English words in the source documents. On average, 96% of the modules and 94% of the words are reused across the product lines. Walmer reported on one recent product family introduction, which required 56 reference manuals that were 100% rewritten. The first manual contained 95,000 new words. Subsequent manuals, each about 500 pages, contained only 55,000 new words.
She concluded that reuse yields massive benefits for both authoring and translation, including reduced cost and cycle time with increased quality. Although Medtronic’s translation volume (measured by the number of pages translated) has increased significantly, the cost per page has continued to decrease.
Chona Shumate of Cymer, a semiconductor company, reported on a customer study showing that field service engineers were taking far too much time to find the information they needed to troubleshoot and repair customer equipment. The survey data led her team to consolidate information into a unified new support forum. Now they are measuring the results to learn whether search time has indeed decreased.
At a luncheon discussion of resource requirements, Shumate described the measurements she used to predict the required time to complete the documentation for a new product. She charted eight years of data, tracing the time required for a finished procedure to go from outline to final draft. She measured the average number of procedures per product and the average number of finished procedures per week. These data allowed her to estimate accurately the number of procedures and time required to complete for the new product, even though the new product was many times more complex than any previous products.
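Shumate’s estimating approach, historical averages of procedures per product and finished procedures per week, reduces to simple arithmetic. All the figures below are hypothetical, not Cymer’s actual data; the `scale` parameter stands in for a new product that is several times more complex than its predecessors.

```python
from statistics import mean

def estimate_schedule(procedures_per_product, finished_per_week, scale=1.0):
    """Estimate the procedure count and weeks of work for a new product
    from historical averages; `scale` adjusts for added complexity.
    All inputs here are hypothetical illustration data."""
    expected_procedures = mean(procedures_per_product) * scale
    weeks = expected_procedures / mean(finished_per_week)
    return expected_procedures, weeks

# Hypothetical history: procedures per past product, and finished
# procedures completed per week.
history_procs = [40, 55, 48, 62, 50]
history_rate = [3.2, 2.8, 3.0, 3.5, 3.1]
procs, weeks = estimate_schedule(history_procs, history_rate, scale=3.0)
print(round(procs), round(weeks, 1))
```

The value of charting eight years of data, as Shumate did, is that the averages fed into this calculation are stable enough to extrapolate even when the new product is much larger than anything measured before.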
Bob Beims of Freescale Semiconductor measured the time saved by using information-development professionals rather than system design engineers to develop architectural documentation. He reported that the writers took one-tenth of the time the engineers required in the product life cycle. Beims used a Critical Chain Project Management analysis to involve information developers earlier in the product-development life cycle, giving them an opportunity to influence the success of the product and to shorten the time required to produce the information. With the process improvements in place, design data is available six to nine months earlier than in the legacy process. Automated conversion of software enablement tools has reduced a process that once took two to three person-months to less than three days.
In the last presentation of the conference’s opening day, Andrea Smith of Altera explained their Closed Loop process for improving the searchability of content on the company’s website. The team improved metadata with the help of Google AdWords and Tag Crowd tools. As a result, Altera saw a 14% reduction in customer support tickets. Their program, called Search before Submit, encourages customers to search for existing information before contacting support. The improved searchability means that those searches are frequently successful.
Clearly, after only the first day of the two-and-a-half-day conference, CIDM members had demonstrated that measurements are a critical part of the information-development process today. Throughout the conference, we heard from CIDM members reporting on their successes with customer wikis, new media, collaboration between documentation and training, and quality.
To wrap up the conference, Mike Eleder and Volker Oemisch of Alcatel-Lucent explained how we can make metrics count. They pointed out that defining what we mean by a measurement can be challenging. For example, they tackled the issue of quality in information, noting that we must first determine the attributes of quality that we want to measure. If we intend quality to mean accuracy, we can collect data on the average number of defects per topic or the number of customer change requests per topic. If we intend quality to mean usability, we can measure the results of usability tests or survey customer satisfaction with the usefulness of the information.
Finally, they pointed to the importance of measuring trends in the data we collect rather than single points in time. If we can show steadily decreasing time to produce topics, we demonstrate an increase in productivity.
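Eleder and Oemisch’s preference for trends over snapshots can be made concrete with a least-squares slope: a steady decline in hours per topic shows up as a negative slope, whatever any single quarter’s number says. The quarterly figures below are invented for illustration.

```python
from statistics import mean

def trend_slope(values):
    """Least-squares slope of a series over equally spaced periods."""
    n = len(values)
    xs = range(n)
    x_bar, y_bar = mean(xs), mean(values)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, values))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

# Hypothetical quarterly hours-per-topic figures.
hours_per_topic = [12.0, 11.4, 10.9, 10.1, 9.6]
slope = trend_slope(hours_per_topic)
print(round(slope, 2))  # negative slope: topics take steadily less time
```

A single quarter’s hours-per-topic figure could be noise; the slope across several periods is the productivity claim worth reporting to management.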
And we can learn a great deal from Doug Hubbard’s ideas about the processes for measuring what we once considered “intangibles.” By following his process, we can successfully measure what we previously believed to be unmeasurable.