Bill Gearhart, Comtech Services, Inc.

The Center for Information-Development Management (CIDM) named the RSA division of EMC as its 2013 Rare Bird Award winner at the annual CIDM Best Practices conference in Savannah, GA. The RSA team, headed by Kevin Kyle, submitted a balanced entry to improve the quality of its content. The team began by acknowledging the need to improve its information set and then created a focused, repeatable, and sustainable process to deliver results.

There are six key milestones in RSA’s content improvement plan process:

1. Scorecard assessment
2. Stakeholder feedback
3. Initial list of content improvements
4. Cross-functional buy-in and prioritization
5. Approved content improvements
6. Completion of improvements

RSA built a balanced scorecard evaluation sheet for a high-level heuristic assessment of its documentation. This assessment serves as a baseline for future improvements, identifies the strengths and weaknesses of a set of content, and makes high-level recommendations.

After a high-level assessment from within the team, RSA gathers feedback from stakeholders—external customers as well as internal stakeholder groups. Customer feedback includes interviews and data gathering, as well as a baseline comparison to competitors.

Next, the proposed improvements are subjected to an agile method for prioritizing and resourcing. The cross-functional stakeholder groups participate in the prioritization and have full buy-in. Finally, the team executes on the planned improvements and then begins the cycle over again.

The RSA Technical Publications content improvement process has been successful because of strong support from Technical Publications management, involvement from customers and internal stakeholders, and a focus on execution and repeatability.

From the initial creation of the process to its implementation on each project, RSA Technical Publications management has assigned resources to identify the content improvements and complete them. This included assigning quarterly MBO goals to the writers to ensure completion of the improvements targeted for the quarter. This helped ensure successful execution, so that the improvements were not viewed as ‘nice-to-haves,’ but rather an integral part of successful content.

Other Rare Bird Award entries included:

Jack Henry Documentation Forecasting, Jack Henry & Associates, submitted by Emily Saltsgaver, Buddy Lee, Leann Long
The Jack Henry forecasting process allows the team to plan proactively and manage its resources as effectively and efficiently as possible. Because of this process, the Jack Henry Documentation team can:

  • Fully utilize existing resources
  • Determine resource availability for new projects
  • Prove resource needs
  • Determine which resources are over-allocated
  • Protect resources
  • Prove resources are fully engaged

Additionally, by comparing forecasted work to actual hours, Jack Henry has identified and corrected inefficiencies in its processes, leading to increased utilization of resources.

EMC’s To the Point: User Centered Writing Challenge, EMC Corporation, Barbara Liberty
EMC’s structured authoring initiative was not advancing as quickly as desired because writers were reluctant to adopt new tools and new processes. EMC had received customer feedback emphasizing the need for content that was task-based and focused, and wanted to accelerate adoption of user-centered writing principles in combination with the reuse efficiencies of structured authoring. This reward challenge was explicitly developed to:

  • Build user-centered writing experience and expertise
  • Improve user experience, particularly for suite products
  • Create a feedback loop to improve subsequent documentation

Huawei’s Integrating Documentation Quality in the Product Development Lifecycle, Huawei, Farhad Patel
Improving documentation quality was one of Huawei’s “TOP5” improvement initiatives in 2012, and documentation received the highest improvement score when the North American project office summarized improvement achievements at the end of the year.

This significant increase in customer satisfaction scores in 2012 can be mainly attributed to the phased end-to-end documentation quality process designed and implemented by the documentation team.

Citrix’s Project DICoRe, Citrix Systems, Braulio Fernandes
DICoRe (DITA Command Reference) is a solution developed by the Citrix NetScaler Documentation Team to solve issues the team faced with parameter descriptions. NetScaler relies on command/parameter descriptions that reside in Perl Modules, but those descriptions were not single-sourced into the documentation. Instead, the team used the descriptions from the Perl Module files as a base copy, got them reviewed, and then manually added them to the documentation. This resulted in the following challenges:

  • Inconsistency in descriptions across interfaces
  • Old and incorrect descriptions in eDocs
  • Missing descriptions in eDocs
  • Customer queries because of the above
  • Bad user experience

In addition, the time writers would have spent copying future descriptions into eDocs topics would have been significant, and maintaining consistency would have remained a nightmare.

The Citrix team’s solution was simple and elegant, but required much coordination and work:

1. Create a Command Reference (CR) from the PM files (to ensure single sourcing from the PM files).
2. Post the CR on eDocs (public site).
3. For every occurrence of a command within the content, provide a live link to the description in the CR.
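The single-sourcing pipeline above can be sketched roughly as follows. This is a minimal illustration, not Citrix’s actual implementation: the `DESC(...)` comment convention, the parameter names, and the HTML output format are all hypothetical stand-ins for whatever format the real NetScaler PM files and eDocs site use.

```python
import re

# Hypothetical PM-file snippet: we assume parameter descriptions live in
# comments of the form "# DESC(name): text". The real Perl Modules
# almost certainly use a different convention.
PM_SOURCE = """
# DESC(ipaddress): IPv4 address assigned to the virtual server.
# DESC(port): Port on which the service listens. Range: 1-65535.
"""

def extract_descriptions(pm_text):
    """Pull (parameter, description) pairs out of a PM file's comments."""
    pattern = re.compile(r"#\s*DESC\((\w+)\):\s*(.+)")
    return {m.group(1): m.group(2).strip() for m in pattern.finditer(pm_text)}

def build_command_reference(descriptions):
    """Render an HTML definition list with one anchored entry per
    parameter, so documentation topics can live-link to each
    description (step 3) instead of copying it (the old manual process)."""
    entries = [
        f'<dt id="param-{name}">{name}</dt>\n<dd>{text}</dd>'
        for name, text in sorted(descriptions.items())
    ]
    return "<dl>\n" + "\n".join(entries) + "\n</dl>"

if __name__ == "__main__":
    descs = extract_descriptions(PM_SOURCE)
    print(build_command_reference(descs))
```

Because the reference is generated from the PM files on every build, a description changed in one Perl Module propagates everywhere it is linked, which is what eliminates the stale and inconsistent copies the team had been maintaining by hand.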