Content Quality Management Practices at Symantec



February 2007


Alexia Idoura, Symantec Corporation

Symantec is a global software company with global communication needs in a rapidly changing industry. Symantec does business with 99 percent of the companies listed on the 2005 FORTUNE 1000 list. We provide content in more than 30 languages for some of our products. We have facilities in more than 40 countries, with information development teams at more than 20 sites in 12 countries. Very few of our teams are entirely co-located with their development teams now; most work with development teams that are also geographically dispersed.

We are being pressured to produce more, better, and faster. What exactly is behind this pressure?

The Pressure to Produce More, Better, Faster

Several factors are increasing the pressure for companies to do more, better, and faster:

  • increased competition in the global market
  • increased customer expectations
  • changes in customer demographics
  • changes in company dynamics

No one can argue that competition continues to increase. In a 2006 survey, 85 percent of managers across several major industries said competitive intensity had increased over the previous five years, with the high-tech industry experiencing the greatest change. Managers attributed the accelerating change to improved competitor capabilities, more low-cost competitors, more competitors, bigger competitors, more innovative market entrants, regulatory changes, rising consumer awareness and activism, and a growing number of attractive and accessible markets (McKinsey, April 2006).

Customer expectations are also on the rise. Customers are more sophisticated, more demanding, and concerned with quality, not just cost. In his keynote address at the Best Practices 2006 conference, Scott Wahl from Research In Motion pointed out that people are used to creating, collaborating, and sharing content directly. They connect into global communities. They have direct access to a vast set of information. What this means is that customers expect to have access to the information they need, in real time, at all times (Wahl 2006). Our profession has been talking about the right information, the right place, the right time for more than a decade. The difference now is that it’s expected rather than just an ideal.

Our customer demographics are changing rapidly. The bulk of our customers used to be based in the United States. Now, the majority of our customers are outside the US, and their numbers are rising. In less than a decade, the spending power in emerging economies is going to rival the current spending power of Western Europe. In the next decade, the world is going to see one billion new consumers from Asia. Within the next 20 years, centers of economic activity, high tech in particular, are going to shift, moving to Asia and within Asia (McKinsey, January 2006).

As a result, companies are constantly implementing changes to maintain their competitive edge. Companies are experiencing rapid growth via mergers and acquisitions. Companies are reorganizing. Companies are reprioritizing their portfolios. Companies are reinventing the way they source work.

As Albert Einstein said, “In the middle of every difficulty lies opportunity.”

Our Opportunity

For Symantec, focusing on source language quality has led to substantial savings (time and money) and lets us achieve “more, better, faster.” This focus supports several objectives of our larger Symantec Unified Content Strategy (UCS) initiative, the mission of which is to create high-quality, Symantec-branded information to deliver a consistent, relevant, and useful customer experience. Those objectives are customer experience, reuse, and localization.

In our own profession, quality is often seen as aesthetic, expensive, and time-consuming: something that needs to be justified, that would be nice to have once everything else is done, and that is going to add time and cost. In other words, quality is something to be defensive about. However, by being aggressive about pursuing source language quality, we contribute to our company’s ability to be competitive. At the Best Practices 2006 conference, Wahl referred to quality not as a nice-to-have but as the distinguishing feature.

The Bottom Line

Before I talk about how our focus on source language quality gives us a competitive advantage by supporting customer experience, reuse, and localization, let me quickly point out the benefit to the bottom line. Figure 1 shows the unit cost of fixing terminology errors at different points in the content supply chain.

Figure 1. Unit cost of fixing terminology errors at different points in the content supply chain (Schütz and Nübel 1998)

The data illustrate what we all know, but what we don’t always do: moving quality control upstream reduces costs.
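The multiplier effect behind Figure 1 can be sketched as a small cost model. The unit costs below are hypothetical placeholders, not the figures from the Schütz and Nübel study; the point is only that errors caught downstream, after localization has fanned the content out into many languages, cost far more to fix.

```python
# Illustrative cost model for fixing terminology errors at different
# stages of the content supply chain. The relative unit costs are
# hypothetical, chosen only to show the shape of the curve.
UNIT_COST = {
    "authoring": 1,       # writer fixes it while drafting
    "editing": 5,         # editor finds it during review
    "localization": 25,   # fixed separately in each target language
    "post-release": 100,  # support cases, errata, re-release
}

def total_cost(errors: int, stage: str, languages: int = 1) -> int:
    """Cost of fixing `errors` terminology errors caught at `stage`.

    Errors caught at or after localization must be fixed in every
    target language, so the cost scales with `languages`.
    """
    multiplier = languages if stage in ("localization", "post-release") else 1
    return errors * UNIT_COST[stage] * multiplier

# 200 errors shipped in 30 languages: catching them upstream is
# orders of magnitude cheaper than catching them downstream.
upstream = total_cost(200, "authoring")
downstream = total_cost(200, "localization", languages=30)
print(upstream, downstream)  # 200 150000
```

Whatever the actual unit costs, the language multiplier alone makes upstream quality control the cheaper strategy for a company translating into more than 30 languages.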

Key Benefits of Focusing on Source Language Quality

Focusing on source language quality, of course, improves the quality of the content and the overall customer experience with our products. Here, I’m referring to quality as a goal in and of itself rather than a means to an end. Customer perceptions of content quality are statistically correlated to product quality.

Focusing on source language quality enables the reuse piece of our strategy. By reuse, I mean three different scenarios. They are all commonly referred to as reuse in the industry, though we distinguish among them. They are

  • reusing content across projects and functions (sharing content between Information Development and Support)
  • repurposing content for different delivery methods (extracting help and PDFs from the same source)
  • leveraging content that was already written and translated for a particular product (preserving content in all languages from a previous version as-is so it can be reused in a new version, including the conversion process from legacy formats to XML)

Focusing on source language quality lets us provide volatile content nearly simultaneously to customers worldwide. What does that mean? By controlling the source, we can automate more easily. By automating (using machine translation and translation memory as part of the localization process), we can ship products in multiple languages simultaneously or close to it.
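The connection between controlled source and automation can be illustrated with a minimal translation-memory lookup. This is a sketch of the general technique, not of Trados itself: previously translated segments are stored keyed by their exact source text, so a consistent, controlled source maximizes exact matches and minimizes the volume that must be routed to machine translation or human translators.

```python
# Minimal translation-memory sketch: segments already translated are
# reused verbatim; everything else is routed to MT or a human.
translation_memory = {
    "Click Next to continue.": {"fr": "Cliquez sur Suivant pour continuer."},
}

def translate_segment(source: str, lang: str) -> tuple[str, str]:
    """Return (translation or original source, routing decision)."""
    hit = translation_memory.get(source, {}).get(lang)
    if hit is not None:
        return hit, "tm-exact"       # reused at no translation cost
    return source, "mt-or-human"     # must still be translated

print(translate_segment("Click Next to continue.", "fr"))
```

If writers phrase the same instruction three different ways, a memory like this sees three distinct segments and reuses none of them, which is exactly why source consistency pays off at the localization stage.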

A few conditions complicate our situation. We have grown and continue to grow by mergers and acquisitions. Every company that joins us has its own standards, its own history, and its own goals. Information developers are decentralized and spread across business units, across projects, and across the world. Some of them report directly to engineers; others are part of semi-centralized teams led by information development managers. Not only are information developers decentralized, but more work is being outsourced. We are working with multiple agencies around the world. Some teams have worked with editors, and others haven’t. Some teams are familiar with the need for style compliance, and others, whether by training or culture, are not.

How We Addressed Source Language Quality

To control source language quality, we addressed three areas:

  • standards
  • processes
  • technology


Integration of all three of these areas was the key. Incremental, isolated tweaks were not going to get us where we needed to go. Only system-wide process and tools changes were going to work.

For standards, we started with relevant standard styles, spelling, and unambiguous grammar. On top of that, we layered terminology and structure. By terminology, I mean standards around which terms we want to use and what they mean. By structure, I’m referring to the architecture of the content.

Figure 2. Before harvesting terminology

Figure 3. After harvesting terminology; one term per key concept

The standards best practices we found especially important are as follows:

  • having published style guides; training people on style standards; having a process to update the style guides; notifying people of changes
  • harvesting terminology from our content; identifying one term per key concept; identifying the terms we want to use and the terms we don’t want to use, as shown in Figures 2 and 3
  • providing editing services to product teams; connecting with other editors throughout the company to agree on standards issues
  • developing standard content models at various levels – deliverables, topic types, and smaller elements such as lists – to complement our DTD and translate it into something our information developers could work with more easily
  • providing auditing services to focus on structural compliance, primarily as a pre-conversion effort
  • providing production review services to help teams create very clean XML to minimize localization and production issues at the end of the cycle
  • providing training and support to content developers around the world, whether they are in-house in our home markets, outsourced, or offshored
  • communicating with internal customers to remind them of the importance of standards; setting and managing expectations; tackling assumptions; identifying and addressing issues as soon as possible
  • communicating with our partners, such as localization


Once we established our standards, we needed to take a close look at our processes. Processes were probably our biggest challenge. We needed to do the following efficiently and effectively:

  • plan and forecast
  • write and edit
  • review
  • translate
  • publish

We developed several best practices, including

  • using supply chain models to evaluate the end-to-end treatment of content assets and focusing on problematic communication and handoffs between stages in the process rather than the stages themselves
  • standardizing processes, including communication and handoffs
  • implementing controlled authoring (compliant with relevant context-sensitive rules and terminology) rather than controlled language (subset of natural language with restricted grammar and vocabulary)
  • providing real-time feedback to content developers for controlled authoring compliance (which essentially becomes continuous self-training) and aggregated data to managers
  • developing an integrated workflow
  • moving from reactive processes (identifying and standardizing terminology during the localization stage) to proactive processes (harvesting terms during the writing process)
  • continuously evaluating how we are doing and making improvements
  • outsourcing and offshoring when appropriate (for example, for skills we didn’t need to preserve in-house or for skills outside our core competencies)
  • providing training and support
  • communicating with internal customers and our partners


I deliberately put technology last because, to be successful, we really needed to establish standards and processes first, and then apply technology appropriately at that point. Of course, the improvement cycle is continuous, so when we implement technology, we uncover standards and process issues. We continue to evaluate all three areas and make needed changes in a controlled way.

We implemented several pieces of technology so we could reach our goals, as shown in Figure 4:

  • acrocheck for controlled authoring, which provides custom rules based on our style guide and machine translation needs, custom terminology databases, and out-of-the-box grammar and spellchecking
  • Vasont for content management
  • XMetaL as our authoring tool
  • Trados for translation memory
  • Idiom Global WorldServer for localization workflow
  • Systran for machine translation

Figure 4. Technology We Used to Meet Our Goals

We have several key best practices around technology:

  • complying with industry standards so we are not limited to proprietary technologies
  • developing an integrated technology stack
  • choosing projects deliberately based on established criteria rather than letting enthusiasm be the primary driver
  • running small pilot projects for each team
  • breaking large projects into smaller phases so they are manageable
  • providing training and support
  • communicating with internal customers and our partners

Has our strategy worked?

Yes.

So far, we’ve had savings of as much as $1 million US in a year with projected savings of $3 million US per year for the next five years.

For machine translation projects, both translation costs and time to market have been cut by 50 percent.

We have achieved 40 percent reuse, 90 percent repurposing, and 75 percent leverage, with more anticipated over time.

To recap, for Symantec, focusing on source language quality has led to substantial savings of time and money. The key has been to have an integrated, well-developed strategy, to implement an accompanying plan, and to communicate, communicate, communicate.

About the Author


Alexia Idoura

Alexia Idoura is a senior manager at Symantec, heading the Standard Delivery Team, a global group providing editing and publishing support to information developers companywide. In addition to overall source language quality, two of her major responsibilities include helping teams prepare content to meet machine translation requirements and managing the conversion process from legacy formats to XML and content management. Alexia coauthored “Moving to Single Sourcing: Managing the Effects of Organizational Changes” for the August 2003 issue of Technical Communication. She was also the technical editor for FrameMaker 5.5 for Dummies and FrameMaker 7: The Complete Reference and has contributed to other commercial technical books.