Kathy Madison, Comtech Services
August 15, 2020

Most content-creating organizations have quality assurance processes to ensure their content is accurate, complete, and meets user requirements. But how do you know if your team is spending too much or too little time reviewing and tweaking content? And how do you know if your quality is “good enough” for your audience? We hope to find the answers to these questions in this year’s CIDM Member Benchmark Study. We’ll investigate how members’ quality assurance processes stack up against their peers’ and what level of maturity the community has achieved relative to Comtech’s Information Process Maturity Model (IPMM).

The IPMM evaluates an organization’s information-development processes in terms of 11 key characteristics of high-performing organizations. Quality assurance and quality management are two of these characteristics and the focus of this year’s benchmark study.

Quality assurance represents a series of activities specifically designed to promote uniform high standards of quality, including copyediting, developmental editing, peer reviews, and technical reviews of draft information products. Each of these quality assurance actions can be fraught with difficulties, especially when project schedules are seriously curtailed. Yet one very important mark of a mature organization is the ability to ensure quality within a high-pressure environment.

In contrast to quality assurance, which focuses on the organization’s ability to conform to internal standards, quality management considers the organization’s ability to meet user expectations. Quality management is a hallmark of a mature information-development organization. By understanding users and how they use information, organizations are better prepared to design more effective and powerful communications.

The benchmark study investigates these characteristics in depth by asking members the following high-level questions via a survey and follow-up interviews:

  • How is quality defined?
  • Who’s driving quality standards?
  • What quality standards are in place?
  • How are quality standards enforced?
  • How are content developers trained to use the standards?
  • What post-production user-focus activities are in place?
  • What factors influence content quality?
  • Do demographics affect content quality?

The study also looks at inclusionary practices to see how organizations are addressing concerns about potentially offensive terminology in their content. Are they replacing terms such as blacklist, master/slave, and disabled? Though not directly related to quality, this trending topic is an essential component of user satisfaction.

We designed the survey and interview questions to help CIDM understand where its members, as a whole, score on the quality processes of the IPMM maturity scale as follows:

  • Level 1, Ad-hoc: No quality standards are in place, and writers are responsible for their own quality.
  • Level 2, Rudimentary: Quality standards exist but are not enforced.
  • Level 3, Organized and repeatable: Quality standards are in place and are enforced through editorial reviews, which are an uncompromised part of the content development workflow.
  • Level 4, Managed and sustainable: Quality standards are institutionalized and followed by habit. Editors are able to concentrate on developmental editing rather than enforcing standards.
  • Level 5, Optimizing: Quality assurance standards and activities are regularly evaluated, and continuous improvements are made.

The benchmark study kicks off in late August, and the results will be revealed during the virtual Best Practices Conference, November 16–18. Participation in the study is a benefit of CIDM membership. If you are interested in participating in the research and becoming a member, please contact [email protected].