Developing Metrics to Justify Resources

CIDM

August 2011


Donald Rohmer & Archana Maheshwari, Citrix Systems, Inc.

Although the economy seems to be recovering, budgets remain tight. Even existing resource levels can require justification if, for example, a replacement is needed for an employee lost to attrition. To meet this challenge and to provide ready answers to management’s questions about our department’s performance, we developed a process for gathering performance statistics and using them to predict our future need for resources. In addition to providing the numbers that are so important to management, the process facilitates recognition of employee accomplishments and identification of problems.

Our department documents a web application delivery appliance that has frequent software upgrades to provide new features, and the past year has seen the introduction of new models incorporating new technologies. Following are some of the management questions that illustrate the need to develop a process for gathering metrics:

  • Where has your time gone? The last software release did not include very many new features.
  • Why haven’t you dealt with more of the reported issues?
  • Can’t you work more closely with other departments, such as Customer Support?
  • Why have your special projects been delayed?

We therefore decided to focus our efforts on the development of a reliable, repeatable, flexible model for gathering productivity metrics. Quality metrics would also be useful, but they were not included in this project. Our basic strategy was to determine the rate at which existing resources can produce pages of output and to apply that data to calculate the level of resources required for keeping up with the release schedule.

Factors Affecting the Metrics

In developing our metrics, we had to consider the teams developing the content, the nature of the content, the state of the product, and a few other factors, as follows:

Teams

  • Knowledge and experience of the documentation team
  • Efficiency and communication skills of the development team providing information to the documentation team
  • Availability of subject-matter experts (SMEs) to answer questions from the documentation team. A complicating factor is that not all SMEs are equally cooperative.

Content

  • Complexity of the content
  • Number and complexity of illustrations (Our writers create their own illustrations, and skill levels vary)
  • Type of information available
  • Number of pages that will actually change

Product

  • Place of the product in its life cycle—new or in maintenance
  • Quantity and complexity of UI elements

Other

  • Learning curve for new applications and other tools, such as an XML editor, as well as any equipment issues
  • Time off, holidays, meetings, breaks, and other outside commitments

Developing the Metrics

To develop our metrics, we analyzed the data we had collected during work on previous deliverables and extrapolated to calculate our future need for resources. One complication was that our previous deliverables had been traditional guides and help topics, and we planned to convert that documentation to DITA for single sourcing. However, we had been preparing the writers for the transition, and we felt that they would only have to learn new tools, not a new way of writing.

But we could not base our metrics simply on our existing rate of producing new pages. Management might respond that we just need to work more efficiently. We first needed to define our standard processes, so that we would have documentation to justify our hours-per-page data. Additionally, a clear definition might reveal possible ways to make our processes more efficient.

Defining Standard Processes

To define our standard processes, we first defined our document development life cycle (DDLC), and then we used that definition as a framework for identifying the common types of work our writers have to perform. We also defined the tasks in terms of output. To make our metrics as accurate as possible in predicting the need for resources, we categorized the types of features that we are required to document.

Defining the DDLC

A writer does not simply sit at a computer and crank out pages of documentation. Document development must progress through the following stages:

  • Scoping
  • Audience analysis
  • Planning and scheduling
  • Research, organization, rough and polished drafts
  • Testing
  • Graphics
  • Three rounds of technical, project management, and editorial reviews
  • Production

Identifying Common Types of Work

Our writers have to find time for the following activities:

  • Document new features
  • Investigate reports of errors (“bugs”) in existing documentation and make necessary corrections
  • Perform the various tasks related to any project. We identified the following project-related tasks:
    • Project planning and management
    • Release management activities
    • Troubleshooting existing tools and templates
    • Gaining efficiency with the new tools
    • Peer reviews
    • Working with documentation and external teams
  • Write release notes for each new software release.
  • Address the list of special projects:
    • Convert documentation to DITA topics
    • Analyze and enhance the usability of our products, especially by participating in user interface (UI) reviews
    • Enhance the tips and error messages displayed in the UI
    • Define standard terminology and metadata
    • Collaborate with Customer Support
    • Define and execute multimedia projects
    • Participate in companywide documentation committees
  • Accommodate paid time off (PTO), holidays, and training

Categorizing the Features

Some features are more difficult to document than others. We defined three levels of complexity for the writing tasks.

A highly complex task has the following characteristics:

  • Complex concepts
  • Interrelationships among topics making the content difficult to organize
  • No source material available and SMEs are unresponsive
  • Complete lab setup needed for testing

A task of average complexity typically has the following characteristics:

  • Revised text, that is, not written from scratch
  • Conceptually not as complex as a highly complex task
  • Very little source material available

A low-complexity task has the following characteristics:

  • Mostly basic procedures (GUI and CLI) with brief introductions
  • Information readily available

To provide a guideline for estimating future projects, we also categorized features and bugs by the size of the effort required:

Size Duration
Small 1-2 days
Medium 3-5 days
Large 6-10 days
Extra large 11+ days

Defining Hours per Page

To determine how much time is needed for writing a page, we analyzed the output of our writers and editors during the course of a seven-month project. For printed deliverables, we compared the number of pages produced by each writer to the number of hours logged by that writer. We did not analyze help topics, because the process of creating those topics will change completely when our single-sourcing project is completed.

Before we could compile meaningful data, we had to answer the question of what constitutes a page. Some updates require only a few changes to certain pages, and others require new pages written from scratch. We applied percentages. A change to a page can count as 25 percent, 50 percent, 75 percent, or 100 percent of a new page, with every page that requires any kind of change counted as at least 25 percent. We defined a full page as about 300 words.
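To make the counting rule concrete, the short Python sketch below is one possible way to tally effective pages; the function name and the list of per-page change estimates are our own illustration, not part of the plan spreadsheets.

    import math

    def effective_pages(change_fractions):
        # Each entry is the writer's estimate of how much of an existing page
        # changed (0.0 through 1.0). Every touched page counts as at least a
        # quarter page, and estimates round up to the next quarter.
        total = 0.0
        for fraction in change_fractions:
            quarters = min(4, max(1, math.ceil(fraction * 4)))
            total += quarters * 0.25
        return total

    # Three partially revised pages and one page written from scratch:
    print(effective_pages([0.10, 0.30, 0.60, 1.00]))  # 0.25 + 0.5 + 0.75 + 1.0 = 2.5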

Our analysis yielded the following figures for projects of high, average, and low complexity, written by junior writers and senior writers:

Complexity Junior Writer Senior Writer
High 18 hours 14 hours
Average 9 hours 8 hours
Low 3 hours 2 hours

For a project of average complexity, a junior writer can write one page in 9 hours, and a senior writer can write one page in 8 hours.

We try to make time for a content edit of every writing task. After an edit, the editor and writer often have to send the file back and forth to resolve queries and ambiguities, and then the editor has to review the file before signing off. In our experience, these activities have required the following amounts of time:

Activity Metric
Content Edit 25 percent of the writing time
Back and Forth 10 percent of the writing time
Final Edit Sign-off 5 percent of the writing time
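
Taken together, these activities add roughly 40 percent to the writing time. For example, a page that takes 8 hours to write consumes about 8 × 1.40 = 11.2 hours by the time the editor signs off.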

When we looked for benchmark data with which to evaluate our metrics, we found that our figures are roughly in line with the rest of the industry. The following sources are representative of what we found:

  • High-Tech Industries: 7 to 8 hours a page (Source: Managing Your Documentation Projects, JoAnn Hackos)
  • Printed Documentation: 3 to 5 hours a page (Source: Fredrickson Communications, http://www.fredcomm.com)
  • Writing and Editing Tasks: 13 hours a page (Source: Jody Lorig’s Estimating Worksheet, http://www.techwr-l.com)

Collecting the Data

We collected our data from the spreadsheets on which our writers maintain their individual plans. Each writer’s plan has a separate sheet for each major type of work: new or updated features, bugs, special projects, and release notes. Each sheet includes columns in which the writer estimates the size and complexity of the effort, the number of new pages and modified pages, and the number of hours required. For example, the sheet for new or updated features includes the following columns (in addition to columns for scheduling and tracking the documentation):

  • Effort (S/M/L)
  • Complexity (low/average/high)
  • Number of pages: new content
  • Number of pages: updated content
  • # of hours
  • Editor
  • Editing effort (low/average/high)

For each feature, the editor verifies the entry in the Complexity column.
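As a rough sketch of how such an entry can be analyzed, the following Python fragment represents one feature row as a dictionary whose keys mirror the columns above; the values and the structure are illustrative only, not our actual spreadsheet.

    # One row of a writer's plan sheet; field names mirror the columns above.
    feature_row = {
        "effort": "M",               # S/M/L
        "complexity": "average",     # low/average/high
        "new_pages": 6,
        "updated_pages": 4,          # counted in quarter-page increments
        "hours": 82,
        "editor": "(editor name)",
        "editing_effort": "average", # low/average/high
    }

    pages = feature_row["new_pages"] + feature_row["updated_pages"]
    print(feature_row["hours"] / pages)  # 8.2 hours per page for this feature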

Analyzing the Data

Our initial analysis revealed that we had spent an average of 9.21 hours per page on the combined writing and editing tasks, a figure that compares favorably with the benchmarks.

Category Metric
Number of Writers 5
Number of Editors 1
Number of Pages (New and Updated) 758
Number of Hours (Writing) 4984
Hours per Page (Writing) 6.58
Number of Hours (Writing and Editing) 6978
Hours per Page (Writing and Editing) 9.21
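
The headline figures follow directly from the totals in the table, as the brief Python check below shows; the snippet is only a sanity check on the arithmetic, not part of our tooling.

    pages = 758
    writing_hours = 4984
    total_hours = 6978  # writing plus editing

    print(round(writing_hours / pages, 2))  # 6.58 hours per page, writing only
    print(round(total_hours / pages, 2))    # 9.21 hours per page, writing and editing
    print(round((total_hours - writing_hours) / writing_hours, 2))  # about 0.40: editing overhead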

We then entered the data into spreadsheets that facilitated comparison of our actual work distribution to the manager’s ideal distribution, which gave us a basis for projecting future needs for resources. For example, we entered the following data for a senior writer (Table 1).

Table 1: Data for a Senior Writer

The data clearly show that, despite working many more hours than should be expected, this writer did not have time to address bugs, attend to project-related tasks, or work on special projects. The “Addtn Hours” column shows that to work on all of the projects her manager would like her to undertake, she would have had to work an additional 296 hours. In percentage terms, this writer’s time would be 183 percent occupied. Even factoring a 10-percent efficiency increase into the manager’s projections, this writer would be 165 percent occupied.
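The occupancy figures come from comparing the hours a writer would need with the hours actually available. The sketch below shows the general form of that calculation; the input values are made up for illustration, and the exact spreadsheet formula (including how the 10-percent efficiency gain is applied) is not reproduced here.

    def occupancy_percent(logged_hours, additional_hours, available_hours):
        # Percent of a writer's available time that the full workload
        # (hours already logged plus hours still needed) would consume.
        return 100.0 * (logged_hours + additional_hours) / available_hours

    # Made-up numbers, for illustration only:
    print(round(occupancy_percent(logged_hours=1000, additional_hours=300, available_hours=650)))  # 200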

Table 2 contains the data for a low performer and identifies a reason why the senior writer had to devote so much time to new features. This writer took 38 percent leave instead of 20 percent.

Table 2: Data for a Low Performer

Table 3a shows the resources that were devoted to each type of work and the resources available for each type of work in a seven-month period if all writers work standard 40-hour weeks. Table 3b shows that 5 writers did the work of 6.4 writers.

Tables 3a & 3b: Resources

Table 4 shows the number and type of features documented by each writer.

Table 4: Features Documented

This data answers management’s question, “Where has your time gone? The last software release did not include very many new features.” Clearly, management’s focus on the larger features had resulted in substantial underestimation of the number of new features to be documented.

Table 5 provides useful data for determining development goals. The “Writing Page Efficiency” figure represents the average hours per page (6.58) divided by the writer’s hours per page.
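As a hypothetical illustration of this formula, a writer who averages 8.2 hours per page would have a Writing Page Efficiency of 6.58 / 8.2, or roughly 0.80, while a writer averaging 5.5 hours per page would score about 1.20.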

Table 5: Development Goals

Table 6 summarizes the percentages for all the writers and the lead. The table clearly shows that we were understaffed by 1.72 resources. With this data to justify our request, management approved requisitions for two additional writers.

Table 6: Percentages for all Writers and Leads

Our metrics project not only gave us the quantified information needed by upper management, but also enabled the writers to quantify their productivity. Now, when they say that they have too much work to do, they can prove it. And they find it easier to realistically scope their upcoming projects. The data also sharpened our focus on areas needing more resources. In particular, it demonstrated the urgency of our need to reduce the load on certain writers. We now have a methodology that is working for us, and we will continue to refine it.

Donald Rohmer

Citrix Systems, Inc.

donald.rohmer@citrix.com

Donald Rohmer is Principal Technical Editor for the Networking and Cloud division of Citrix Systems. In ten years with his own company, Editorial Consultants, Inc., of San Francisco, he worked closely with top executives of large and small companies and public agencies. At Citrix, he works with Archana Maheshwari to continuously improve the capabilities of the documentation team.

Archana Maheshwari

Citrix Systems, Inc.

archana.maheshwari@citrix.com

Archana Maheshwari is Senior Documentation Manager at Citrix Systems and manages teams in Bangalore and the US. She spent 22 years in the Middle East, Canada, and the US before moving to Bangalore in 2005 to set up the first documentation group at Juniper Networks. Before that, she worked with Cisco Systems for seven years in San Jose. An MBA, Archana worked as an HR specialist, programmer, and IT teacher before moving into the documentation field.

REFERENCES

Hackos, JoAnn T. Managing Your Documentation Projects. New York, NY: John Wiley & Sons, 1994. ISBN: 0471590991.

Fredrickson Communications <www.fredcomm.com/articles/detail/stop_guesstimating_start_estimating>

Jody Lorig’s Estimating Worksheet <www.techwr-l.com/articles/estimatingworksheet>
