CIDM

December 2001


From the Director


JoAnn Hackos, CIDM Director

Dear Friends,

I want to thank once again all the brave souls who joined us on Cape Cod for the third annual Best Practices conference. Even though attendance was lower than in previous years, the gathering was a great success. The smaller group allowed everyone to get to know nearly all the participants, forging friendships and networks for the future. The energy levels were high, the weather beautiful, the seafood exceptional. Over the next few months, we hope to bring some of the content and camaraderie to those unable to join us. Sorry we can’t bring the seafood!

The Balanced Scorecard provided a challenge: Use measurements to guide activities throughout the information-development life cycle. Use measurements to decide which activities are worthy of pursuit and which should be eliminated. Learn to speak the language of metrics, a language both understood and valued by senior business managers. Find new ways to communicate the value of information development to peers, colleagues, and superiors.

Providing us with a fine start on Monday morning, Angela McAlister demonstrated how her knowledge base development team provides an essential service to 3COM customers. Angela took us through the metrics she applies to judge the success of 3COM’s Knowledge Base (KB) Web site. We learned, for example, of the clear correlation between the quality customers perceive in the KB and the speed with which information moves from author to site. This measure, called “content turns,” tracks the number of days between creation and publication. As you can see, “content turns” is a measure of operational effectiveness, of how well our staff performs. The more up-to-date the content, Angela explained, the more satisfied customers are with the KB and the more likely they are to use it on a regular basis to solve problems.
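As an illustration of how a content-turns measure might be computed, here is a minimal sketch in Python. The article identifiers, dates, and field names are my own invention, not 3COM’s; only the definition (days from creation to publication) comes from Angela’s talk.

    from datetime import date

    # Hypothetical KB articles with creation and publication dates
    articles = [
        {"id": "KB-101", "created": date(2001, 9, 3), "published": date(2001, 9, 10)},
        {"id": "KB-102", "created": date(2001, 9, 5), "published": date(2001, 9, 26)},
        {"id": "KB-103", "created": date(2001, 9, 12), "published": date(2001, 9, 17)},
    ]

    # Content turn: number of days between creation and publication
    turns = [(a["published"] - a["created"]).days for a in articles]

    # A team might track the average turn against a monthly target
    average_turn = sum(turns) / len(turns)
    print(f"Average content turn: {average_turn:.1f} days")  # 11.0 days

Driving that average down is what links the operational measure to the customer measure: the shorter the turn, the fresher the content customers find on the site.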

By building cause-and-effect relationships in her business unit’s Balanced Scorecard, Angela is able to link operational effectiveness with customer satisfaction. In this case, satisfied customers use the KB to solve problems rather than calling customer support, a cost-reduction strategy.

By the way, Angela promises to develop a series of articles about financial measures for the 2002 Best Practices newsletter.

Palmer Pearson, senior manager at Cadence Design Systems, demonstrated that paying attention to customer calls provides a window of opportunity for information development. The Cadence goal was to reduce publications-related calls (“can’t find it, can’t understand it”) by 50 percent in 12 months. Measuring something that doesn’t happen is tricky; as Palmer explained, you have to look for indirect evidence. Are call volumes going down? Is the ratio of phone to electronic calls changing? Are the types of calls changing? What about anecdotal evidence?

Each month, Palmer’s team measured the hit rate on the FAQ pages of the Cadence customer-service Web site. They found that 25 percent of the FAQ hits answered the customers’ questions sufficiently; the quality of the information eliminated those customers’ need to talk to a live body.

As a result of better online information, the Cadence information developers hit their target six months early: within six months, the call volume attributable to publications problems dropped from 14 percent of all calls to 6.8 percent. At approximately $500 per call, that amounted to call-avoidance savings in excess of $2.5 million. In addition, customer satisfaction numbers have continually improved. For their work, the Cadence information developers, along with colleagues in customer support, received the Cadence 2001 Quality Award.
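For readers who want to check the arithmetic, a minimal sketch follows. The $500 cost per call and the drop from 14 percent to 6.8 percent come from Palmer’s talk; the total call volume is a hypothetical figure I chose so the numbers land on the same scale, not a Cadence statistic.

    # Hedged sketch of the call-avoidance arithmetic
    cost_per_call = 500      # approximate cost of one support call (USD)
    before_share = 0.140     # publications-related share of calls before
    after_share = 0.068      # publications-related share six months later
    total_calls = 70_000     # hypothetical call volume over the period

    avoided_calls = (before_share - after_share) * total_calls
    savings = avoided_calls * cost_per_call
    print(f"Avoided calls: {avoided_calls:,.0f}")  # 5,040
    print(f"Savings: ${savings:,.0f}")             # $2,520,000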

Helen Sullivan, director at Nortel Networks, explained one of the many measurements that make up the Balanced Scorecard for her information-development team: the commissioning-readiness metric. Commissioning readiness is a customer measure with a 5-point rating scale.

5    I could set up a system easily with no help (unless something really weird happened).

4    I could set up a system with minimal assistance (such as the beeper number of a systems engineer).

3    I could set up a system, but I would need a systems engineer to be available in person when I needed one.

2    I could set up a system only if the systems engineer was with me the entire time.

1    It would take me a long time to learn how to set up a system even if the systems engineer was with me.

Each month, Helen’s team tracked the responses for each product line against an average baseline. Helen recommends that managers limit themselves to 9 to 16 measurements and plan to measure each month. Start with a skeletal Balanced Scorecard, even if you have few metrics at the outset. Fill it in with data that is readily accessible, adding one metric per month. Once a metric has been firmly established, hand off the responsibility for collecting data and monitoring that metric to a team member.
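To make the monthly tracking concrete, here is a small sketch of how average commissioning-readiness scores per product line might be compared against a baseline. The product names, scores, and baseline value are invented for illustration; only the 5-point scale comes from Helen’s talk.

    # Hypothetical monthly survey responses on the 5-point scale
    monthly_scores = {
        "Product A": [4, 5, 3, 4, 4],
        "Product B": [2, 3, 3, 2, 4],
    }
    baseline = 3.5  # assumed average baseline

    for product, scores in monthly_scores.items():
        average = sum(scores) / len(scores)
        delta = average - baseline
        print(f"{product}: {average:.1f} ({delta:+.1f} vs. baseline)")

    # Product A: 4.0 (+0.5 vs. baseline)
    # Product B: 2.8 (-0.7 vs. baseline)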

In the next few issues of Best Practices, I hope to report on more of the interesting and valuable measurements introduced by our wonderful group of skilled and experienced managers like Angela, Palmer, and Helen. If you have metrics that are effective in reporting to senior management, please send me a note. Next year, we hope to continue the nascent practice of Poster Sessions, in which we report on new metrics that have proven effective in establishing the value of technical information.

JoAnn Hackos
