Content Improvement through Content Analysis: A Case Study from Lucent Technologies IP&T Organization

CIDM Best Practices Newsletter

February 2007


Norma Emery, Lucent Technologies

With faster turnaround times, increased workloads, and multiple projects to support, the information developers of the Lucent Technologies Information Products and Training (IP&T) organization face growing challenges in ensuring the quality of their information products (IPs). To help them continue to deliver high-quality IPs on time and to meet IP&T’s goals for quality and customer satisfaction, the IP&T Editing Team was asked to conduct a Content Analysis Project. The project involved analysis of content developed in three geographic regions: North America, Europe, and India.

For this project, content was edited, and statistical and descriptive results were compiled and provided to stakeholders. While only nominal differences were identified in the overall quality of the content across the regions, areas for improvement and patterns of noncompliance emerged within each region. Data derived from the evaluation gave IP&T management useful information for measuring the success of its workforce globalization initiatives, for identifying priorities for content improvement, and for determining priorities for the implementation of an automated editing tool.

Project Goals

The Editing Team’s primary goals for the project were to:

  • Identify differences in writing quality across the regions
  • Identify the most prevalent writing and editing issues by region
  • Obtain baseline metrics for evaluating future improvements to IP&T content
  • Obtain input for the implementation of an automated editing tool

Project Activities

The project involved the following major activities:

  • Developing a methodology
  • Selecting content and IP samples
  • Determining the standards to be applied
  • Defining the editing criteria
  • Developing the reporting tools
  • Editing the content objects and recording the results
  • Evaluating the information products
  • Developing recommendations
  • Performing appropriate follow-up activities

Developing a Methodology

At the outset of the project, the Editing Team determined that a high-level methodology was needed to obtain useful results. We recognized the need to pinpoint, with a reasonable measure of objectivity, the types of errors that recurred in both content objects and complete documents. We also recognized the need to define our reporting deliverables, to the extent possible, before we conducted any editing work. In addition, we would need to develop consistent criteria for selecting and for editing samples.

To record our methodology, we developed a flowchart depicting the steps we would follow. As work on the project progressed, we refined our methodology where we identified process gaps, modified the flowchart accordingly, and fine-tuned our reporting deliverables. For example, while gathering the content samples, we used estimated page counts for the content objects. Once we recognized that errors-per-page data would be statistically unreliable if based only on estimated page counts, we defined a standard page length and modified our methodology to use that fair, objective measure.

Selecting Content and IP Samples

An evaluator from the IP&T Editing Team chose sample content of various content types from IPs developed by IP&T information developers in each of the three regions. The names of the authors of those IPs were unknown to the evaluator. All content objects were chosen at random from a range of possible IPs identified for evaluation by senior managers. An additional criterion was that each IP from which content was selected be in published, not draft, form.

The Manager of the Editing Team then identified categories of content types to be evaluated. The following categories were selected:

  • Descriptive content
  • Procedural content (hardware)
  • Procedural content (software)

The content sample used for the editing component of the project was relatively small (44 objects, totaling 60.12 pages). The Editing Team believes that, should the sample grow and the project be repeated, even more reliable data could be obtained. For the high-level assessment component of the project, 12 IPs were evaluated.

Determining the Standards to be Applied

In editing the sample content, the evaluator applied the following Lucent House Style (LHS) standards:

  • LHS Writing Standards
  • LHS Word List
  • LHS Abbreviations Database

The following non-IP&T reference materials were also used for the project:

  • Merriam-Webster’s Collegiate Dictionary, 11th edition
  • The Chicago Manual of Style, 15th edition

As the editing activities described in the next section were performed, the evaluator also identified necessary enhancements to the LHS Writing Standards. The enhancements were entered into a tracking mechanism for future updates to the document. Thus, the project also yielded the additional benefit of identifying where IP&T standards needed improvement or clarification.

Defining the Editing Criteria

Once the sample content was identified, the Editing Team created an Editing Checklist against which to edit the content. As editing of the content progressed, the number of checklist items was reduced to eliminate editing activities for which few, if any, issues were found in the content. Through this process, the team was able to hone the checklist to the following editing activities:

  • Abbreviations and symbols are correct, consistent, unambiguous, and in conformance with the LHS Writing Standards.
  • Comprehension of content is acceptable; content is easy to understand, and sentences are not overly complex.
  • Grammar and syntax are correct.
  • Information Mapping® map types used to structure the content are appropriate for the type of content being conveyed.
  • Information Mapping® principles (scannability, chunking, relevance, labeling, consistency, integrated graphics, accessible detail, hierarchy of chunking) have been applied to the content.
  • Parallel (grammatical) structure is maintained.
  • Pronoun use is correct based on the LHS Writing Standards.
  • Spelling is correct.
  • Terms are used consistently.
  • Verb mood, person, tense, and voice are used appropriately in the context of the sentence or paragraph.
  • Word choice and usage (acceptable ways of using words and phrases) are correct.
  • Punctuation adheres to established standards.

Developing the Reporting Tools

As one of our project reporting mechanisms, the IP&T Editing Team developed an Excel spreadsheet, with multiple worksheets, to track data derived during the evaluation of content objects. In the Content Objects Summary worksheet (Figure 1), the evaluator entered descriptive data—such as the page and word count, the region in which the content was developed, and the type of content—about each sample object. The evaluator then assigned each content object a Content Object ID, which was used to reference each content object in all other project deliverables. This figure does not represent the actual results of our analysis, but is intended to show the structure of the tool we used to report on the data.


Figure 1. Content Objects Summary Worksheet (Example)
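As an illustration of the worksheet’s role, the following minimal sketch (in Python, our choice here, not a tool the project used) models one row of the Content Objects Summary worksheet. The field names are hypothetical; the actual columns are those shown in Figure 1.

    # Hypothetical model of one row in the Content Objects Summary worksheet.
    # Field names are illustrative only; the actual columns appear in Figure 1.
    from dataclasses import dataclass

    @dataclass
    class ContentObject:
        object_id: str     # Content Object ID referenced in all other deliverables
        region: str        # "North America", "Europe", or "India"
        content_type: str  # descriptive, procedural (hardware), or procedural (software)
        word_count: int    # raw word count of the object
        pages: float       # normalized page count (word_count / 200; see below)

    sample = ContentObject(object_id="NA-01", region="North America",
                           content_type="descriptive", word_count=820,
                           pages=820 / 200)  # 4.1 normalized pages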

For each region, the Editing Team also developed an Editing Results worksheet to record the results of editing the objects from that region. The Editing Results worksheet is depicted in Figure 2. This figure does not represent the actual results of our analysis, but is intended to show the structure of the tool we used to report on the data.

This worksheet listed each activity from the Editing Checklist (Column B in the figure) and used the Content Object IDs from the Content Objects Summary worksheet as column headings (Columns E through J in the figure).

Editing the Content Objects and Recording the Results

With the sample objects chosen and the editing criteria identified, the evaluator edited each object. While editing an object, the evaluator classified each edit according to the editing activity number (Column A in Figure 2) from the Editing Checklist and entered this number as a comment, along with the editorial markup, directly into the edited object. The evaluator also entered editorial markup comments into the appropriate cell of the Editing Results worksheet for ease of reference. The evaluator then searched the comments in each edited object to tally the number of edits for a given editing activity within that content object. The resulting totals were then entered into the appropriate cell of the Editing Results worksheet (Cells E2 through J13 of the figure). Total edits by editing activity across the region were also tallied (Column D in the figure) and ranked (Column C). This latter task identified the types of errors that were most prevalent within a given region for each content type.


Figure 2. Editing Results Worksheet (Example)
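The tallying step described above can be sketched as follows. This is a hypothetical illustration that assumes the activity numbers have already been extracted from the comments in one edited object; in the project itself, the resulting counts were recorded in the Editing Results worksheet.

    # Hypothetical sketch of the tallying step: each edit in an object is
    # tagged with an Editing Checklist activity number, and the tags are
    # counted to yield edits per activity (illustrative data only).
    from collections import Counter

    edit_tags = [2, 3, 3, 7, 10, 3, 12, 7, 3]  # activity numbers from one object

    tally = Counter(edit_tags)        # edits per editing activity
    ranking = tally.most_common()     # activities ranked by edit count
    print(ranking)                    # [(3, 4), (7, 2), (2, 1), (10, 1), (12, 1)]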

In addition, the number of edits per page (for all editing activities) for each content object was calculated and entered in the Content Objects Summary worksheet (Column M in Figure 1). The number of edits per page for all content objects within a given content type was also calculated (Cell M11 in Figure 1), yielding an average number of errors per page within a given content type by region. To help ensure an accurate reading of errors per page, the evaluator ran a word count on each object and divided it by a benchmark value of 200. The value of 200 was chosen because it is the average number of words per page (for English content on an 8.5- by 11-inch page) observed by our Translations group. A character count was also run on each object and divided by the object’s word count to obtain the object’s average word length. Average word counts by region, and by region and content type, were also reported in the results spreadsheet. The evaluator then aggregated all of the numerical data into other summary spreadsheets (not shown here) so that the data could be further analyzed from various perspectives.
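A brief worked example summarizes this arithmetic. The counts below are hypothetical; only the 200-words-per-page benchmark comes from the project.

    # Hypothetical worked example of the page normalization described above.
    word_count = 820    # words counted in one content object (illustrative)
    char_count = 4510   # characters counted in the same object (illustrative)
    edits = 12          # total edits recorded for the object (illustrative)

    pages = word_count / 200                   # 200 words = one standard page
    edits_per_page = edits / pages             # 12 / 4.1 ≈ 2.93
    avg_word_length = char_count / word_count  # 4510 / 820 = 5.5 characters

    print(f"{pages:.1f} pages, {edits_per_page:.2f} edits/page, "
          f"{avg_word_length:.1f} chars/word")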

Evaluating the Information Products

In addition to the detailed numerical analyses performed on selected content objects, a more subjective evaluation was conducted of four IPs from each region (for a total of 12 IPs). This evaluation consisted of scanning the text to assess such qualitative and structural issues as the relevancy of headings to the content, the length of procedures, the presence of dense blocks of text, and the quality of indexes. For this phase, the evaluator also obtained counts, at the IP level, of general usage errors that often signal content in need of improvement, as well as counts of usage errors based on deprecated terms identified in the LHS Writing Standards. For example, the evaluator searched for and tallied the number of occurrences of expletives, such as “there are” and “it is.” The Editing Team believes that such data can serve as a leading indicator of the level of effort required to bring an existing IP&T information product into conformance with the LHS Writing Standards.
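A minimal sketch of such a scan follows, assuming simple regular-expression patterns for the two expletive constructions named above; the project’s actual search method is not described in this article.

    # Hypothetical sketch of the usage-error scan: count expletive
    # constructions such as "there are" and "it is" in an IP's text.
    import re

    EXPLETIVE_PATTERNS = [r"\bthere\s+(?:is|are)\b", r"\bit\s+is\b"]

    def count_expletives(text: str) -> int:
        """Return the total number of expletive constructions found in text."""
        return sum(len(re.findall(p, text, flags=re.IGNORECASE))
                   for p in EXPLETIVE_PATTERNS)

    print(count_expletives("There are three modes. It is possible to continue."))  # 2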

Developing Recommendations

Based on the samples we evaluated, the Editing Team identified no extreme differences in the overall quality of the content across the regions. However, within each category of content and for each region, we observed trends in the types and quantities of errors. In addition, we were able to identify the top five areas for improvement within each region.

Using the data from this project, the IP&T Editing Team developed a set of recommendations and submitted those recommendations to IP&T management for consideration as quality improvement initiatives. Some of these recommendations included the following:

  • Conduct the assessment periodically, using additional content samples, for high-priority projects.
  • Integrate the results of such assessments into the existing data to update the baseline data.
  • Write a new section on “Usage” for the LHS Writing Standards.
  • Ensure that terms that were identified as “Usage” issues are captured as deprecated terms (with appropriate replacements) in the automated editing tool.
  • Prioritize rules development for the automated editing tool based on those editing activities with the highest number of errors.
  • Use the “Average number of errors per page” within a given content type and region to assess, for example, the types of content that could be more suitably written in each region.
  • Use the “Average number of errors per page” values as baseline data against which to judge the success of IP&T’s ongoing efforts to further improve the quality of its information products.
  • Use the ranking data for each editing activity—in conjunction with the comments in each cell—to guide development of targeted job aids and training modules for information developers. Development of such job aids should be prioritized based upon the top five ranking errors.

Performing Appropriate Follow-up Activities

As a result of our recommendations, the Editing Team was tasked with undertaking a multi-phase initiative to support IP&T information developers and to improve the quality of our information products. In the first phase of our follow-up to this analysis, the IP&T Editing Team developed two detailed Editing Checklists (a Developmental Edit Checklist and a Copyedit Checklist) to be applied to content by information developers. We also developed a detailed editing process document and job aids for the various editing methods (electronic and manual).

In the second phase of our follow-up, we wrote a four-part Editing and Writing Workshop and delivered the workshop, in a virtual classroom environment, to information developers in one of the regions. The workshop specifically targeted the areas identified in our evaluation of content for that region. For example, in addition to providing general guidance on grammar, style, usage, and punctuation issues, the workshop stressed adherence to internal standards and provided detailed information on where to locate appropriate standards. Our lesson on usage also provided examples taken directly from the content evaluated and provided suggestions for making informed word choices.

The workshop was well received by the participants and earned an overall satisfaction score of 4.23 on a scale of 5. From our survey of workshop participants, the IP&T Editing Team learned that examples and exercises are key to teaching sound writing and editing practices to the audience we serve.

Currently, the Editing Team is also assisting in the implementation of an automated editing tool that we expect will further improve the quality of our information products. The results of our Content Analysis Project will, we believe, be highly useful in ensuring that our implementation of the automated tool is both effective and successful.

For the third phase of our follow-up, we plan to develop other tools and workshops that will assist IP&T information developers in their ongoing efforts to improve writing quality and customer satisfaction.

About the Author


Norma Emery
Lucent Technologies
neemery@alcatel-lucent.com

Norma Emery holds a BA in English from Hood College, Frederick, Maryland, and is currently completing her MA in Humanities at the Hood College Graduate School. She has more than 25 years of experience in content and document development with an emphasis on content reuse. As a Senior Technical Editor for Lucent Technologies, Norma is involved in technical writing and editing, research, analysis, project management, and internal standards. Her personal interests include spending time with her sons, studying French, painting, and contributing her time to environmental issues in Washington County, Maryland.

 

 
