JoAnn Hackos, Comtech Services, Inc.
Technical communicators have always shown allegiance to standards. The Chicago Manual of Style represents the starting point for most department style guides. The Microsoft style guide demonstrates how to write about products that are based on the company’s operating system. Government style guides govern a variety of government-oriented publications. In the early days of my technical communication career, we considered the U.S. Geological Survey’s style guide to be the industry bible. The Modern Language Association (MLA) style guide governed the writing we did in graduate school.
Local standards are developed for internal use but they are usually established with some stated relationship to a published, standardized style guide. Technical communicators depend on a plethora of standards to write and edit technical content.
Despite our devotion to standards for writing, when it comes to technology we are pure novices. We have tended to use proprietary technologies to support our work: Microsoft, IBM, Xerox, Hewlett-Packard, Adobe, Help-development systems, and so on. I’ve used them all over the past 30 years, and every one of them is based on a technology owned, lock and key, by a specific company.
So what’s the problem? Once you have your content in a proprietary system, getting it out again is often tricky and frequently unsatisfactory.
Documentation standards have been around for some years
DocBook, which uses SGML and XML to define books and articles, was first published in 1991 by HaL Computer Systems and O’Reilly & Associates. It was accepted as an OASIS standard in 1998. DocBook defines markup to specify the parts of a book, including chapters, sections, front matter, appendices, and so on.
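To illustrate, a minimal DocBook book might be marked up as follows (the element names are standard DocBook; the titles and content are invented for illustration):

```xml
<!-- A minimal DocBook sketch: the markup names the parts of the book,
     not their appearance. All titles and text here are hypothetical. -->
<book>
  <bookinfo>
    <title>Widget User Guide</title>
  </bookinfo>
  <chapter>
    <title>Getting Started</title>
    <para>Install the widget before first use.</para>
  </chapter>
  <appendix>
    <title>Error Codes</title>
    <para>A list of error codes and their meanings.</para>
  </appendix>
</book>
```

A stylesheet applied later decides how a chapter or appendix is rendered, which is what keeps the content independent of any one formatting tool.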
DocBook and SGML have not been widely embraced by the technical communication community. At about the time that SGML emerged (SGML has been an ISO standard since 1986), the technical communication community was thoroughly dedicated to the desktop-publishing, WYSIWYG model. Tagged markup languages, which had been used in the 1970s and 1980s, were considered passé. Desktop publishing easily won out among technical writers. It gave them direct and detailed control over the look of the page, a significant advance over the typewriter look.
DocBook was more likely to be adopted by government agencies and others who recognized the need for a standard that would not become obsolete in a decade. DocBook provided and continues to provide that advantage. Because the SGML (now also XML) markup does not contain formatting information, formats can easily be modified, different formats can be applied to the same markup, and the end result is not dependent on proprietary formatting tools (like MS Word, FrameMaker, RoboHelp, and so on).
SGML and DocBook were also more expensive to implement than desktop publishing systems, largely because of the cost of editing software that supported writers rather than programmers using SGML markup, and because many DocBook implementations had to be customized to be usable.
Note that one of the significant issues in standards adoption is tools. If tool developers support a standard, usable tools quickly become available. However, tool developers will not invest in a standard unless it looks promising. The more organizations that are interested in adopting a standard, the more useful and usable the tool support will be. We might argue that the high cost of SGML tools reflects the low level of adoption. When adoption is limited to government agencies with seemingly unlimited budgets and Fortune 100 companies, tool vendors not only serve restricted markets but can also charge high prices.
The DITA standard makes waves
DITA (the Darwin Information Typing Architecture) was accepted as an OASIS standard in 2005 after several years of development at IBM. DITA, like DocBook, was designed to address the needs of the technical communication community. However, unlike DocBook, DITA is not book-centric. Instead, its architecture is based on the concept of the topic, defined as a standalone unit of content that can be combined with other topics to form a book, an article, a help system, a website, and so on.
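The topic-plus-map model is easy to see in markup. A DITA map collects standalone topics into a deliverable; here is a minimal sketch (the file names and title below are invented):

```xml
<!-- A DITA map assembles standalone topics into one deliverable.
     The topic file names here are hypothetical. -->
<map>
  <title>Widget Help</title>
  <topicref href="installing_the_widget.dita"/>
  <topicref href="how_widgets_work.dita"/>
  <topicref href="widget_command_reference.dita"/>
</map>
```

The same topics could be referenced from a different map to produce, say, a quick-start guide or a help system, without rewriting any content.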
DITA further defines the topic architecture around the key concept of information typing. Rather than defining the parts of a book’s structure, as DocBook does, DITA defines the types of content that are typically contained in technical manuals. The core DITA information types are task, concept, and reference. Like the types described by Robert Horn in his work on Information Mapping, the DITA types provide guidelines and templates that prescribe the content units that tasks, concepts, and references should contain.
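As a sketch of information typing, a DITA task topic constrains the author to task-shaped content. The example below uses standard DITA task elements; the product details are invented:

```xml
<!-- A DITA task topic: the element names (taskbody, steps, cmd, and
     so on) enforce the task information type. Content is hypothetical. -->
<task id="installing_the_widget">
  <title>Installing the widget</title>
  <taskbody>
    <prereq>Unpack the shipping carton.</prereq>
    <steps>
      <step><cmd>Connect the power cable.</cmd></step>
      <step><cmd>Press the power button.</cmd></step>
    </steps>
    <result>The status light turns green.</result>
  </taskbody>
</task>
```

Concept and reference topics impose analogous structures, so each information type carries its own built-in template.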
DITA, based on standard XML tagging, has taken the technical communication community by storm. In fact, IDC predicts that fully 50% of the technical communication produced globally will soon be produced using DITA.1
The OASIS DITA Adoption Technical Committee, recently formed under the co-chairs JoAnn Hackos and Gershon Joseph, has taken up the task of promoting DITA adoption globally.
So—why should you adopt a standard?
As technical communicators and managers, you likely already understand the power of standards adoption. But let’s review the benefits here.
- Standards reduce the cost of technology adoption for all organizations. A small cadre of willing volunteers keeps advancing the standards in response to the needs of the community for improved functionality and usability. When you implement your own technology, such as a customized XML-based content model, you are alone when it comes to updating and solving problems.
- Standards adoption gains the respect of information technology experts in our corporations. IT professionals, engineers, and software developers all embrace standards in their own work. They understand the importance of supporting standards development for technical communication.
- Standards mean convertibility. Moving from one proprietary system to another is fraught with difficulties, as most of us have already experienced when we moved from various word-processing and desktop publishing systems to FrameMaker or from WordPerfect to Word, and so on. By adopting a standard, we ensure that we will be able to move content from one standards-based system to another should the need arise. Bob Gruen, now retired as the publications manager at SPSS, once told me that he had used more than one standards-based content management system over a 10-year span and had never lost any data.
- As standards are more widely adopted, tools support is strengthened. As we’ve begun to see with the DITA standard, more tools developers are adding their support. Tool costs continue to come down as more companies enter the market (leading, someday, to consolidation, of course). Recently, tools support for DITA has been extended to the ubiquitous MS Word, making it possible for enterprise players to enter the fray.
- Standards are interdependent. The OASIS DITA Translation subcommittee has supported the integration of DITA and the OASIS XLIFF standard to simplify the creation of translation packages from DITA maps and topics. The new OASIS Open Architecture for XML Authoring and Localization (OAXAL) Technical Committee takes DITA and relevant Open Standards from OASIS, W3C, and LISA to create a comprehensive new reference architecture for technical publishing. OAXAL is based on experience gained from a practical and very successful integration of the relevant standards. OAXAL provides a ‘best of breed’ approach to creating a DITA-based authoring, translation, and publishing solution.
STC Intercom will publish a special issue in November 2008 on standards for the technical communication community. To quote Annette Reilly, the chair of the STC Standards Council, “STC belongs to leading international and national standards organizations, sending volunteer members to participate in their working groups:
- International Standards Organization (ISO) and IEEE for software user and life-cycle documentation standards
- In the United States, National Information Standards Organization (NISO) for information management, metadata, and technical report standards
- W3C for web standards
- OASIS for DITA information standards
- LISA/OSCAR for translation language standards”
As part of its contribution to the standards world, STC supported the development of the new ISO standard. In June 2008, the International Organization for Standardization (ISO) and the International Electrotechnical Commission (IEC) published ISO/IEC 26514, Systems and Software Engineering—Requirements for Designers and Developers of User Documentation.
This new standard states as a requirement that user documentation shall be based on user and task analyses. If you have had difficulty getting your organization to support user studies, you might be able to convince them that following the ISO standard requirements would be in their best interests.
I strongly recommend that you review the new ISO standard. The best practices that we describe in that standard should become part of your organization’s process. With standards supporting you, your business case to senior management to institute industry best practice is significantly strengthened.
Standards are a tremendous benefit to those of us who have worked hard to promote industry best practices. It should be a requirement that all technical communicators understand and support our industry standards. Standards knowledge should be part of our hiring criteria and of our educational programs, as George Hayhoe explains in his forthcoming article in the special Intercom issue.
Now is the time for standards in technical communication. Remember that standards are not about tools but are about aligning your organization with the international community. Get started today!
To add your views, go to http://www.cidmblog.com. You’ll find this article and be invited to add your own comments.