Building The Quality In, Instead of Bolting It On…



February 2008

Building The Quality In, Instead of Bolting It On…

David A. Reid, NXP Semiconductors BV

In The Beginning There Was…

In the Technical Documentation Services (TDS) department of NXP Semiconductors (Netherlands), we have been developing our own quality checking tools and making them available online to our authors. We have a worldwide audience of technical authors, writing the Data Sheets, User Manuals, and Application Notes for our semiconductor devices.

To obtain some level of consistency, we used to have a ‘traditional’ process model, which I am sure you all recognize. We (TDS) supplied a template (in Adobe FrameMaker), and the authors (mostly technical engineers) put the data into the document. When the author thought the document was finished, it was sent to our production team, where we checked that it adhered to the Publication Standard and either sent it back or reworked it ourselves before publication. After many back-and-forth cycles, the final document was returned to the originating author for approval and signed off by the manager of their department.

This process was incredibly time-consuming and a bit uncontrolled because the authors in the field sometimes did not have the time, skills, or knowledge to implement the Publication Standard properly, and our Publication Standard was open to individual interpretation.

We decided to do something about the problem. We started by simplifying the Publication Standard to a one-page list of checks which we called the “Acceptable Quality Level” (AQL). This list is a set of criteria which every document we publish must meet or exceed. The whole Publication Standard still exists, and this list is a calculated and deliberate extraction from the Standard.

We developed a procedure for each item on the AQL checklist, and the document is checked according to these procedures so that the checking is consistent, regardless of the person doing it. Most checks were black or white cases, but some require interpretation and have been revised to include exceptions. This process continues to evolve. When we want to improve or adjust the Quality Level, we change or add new procedures.

What became apparent after implementing these AQL checks was that a large number of the checks could be done automatically by software. Many of the results were ‘OK’ or ‘not OK.’ This fact prompted the creation of an AQL checking tool called the “AQL Assistant Suite.” We created a suite of tools that check different aspects of the document. Some of these I will discuss in principle rather than as specific cases.

The AQL Assistant Suite

The AQL Assistant Suite (Figure 1) consists of eight modules, each performing a particular functional test and providing an individual results page. The software is web-enabled and can be used by anyone with access to our intranet site worldwide. It is available 24/7 and therefore accommodates the different working hours around the world.


 Figure 1: AQL Assistant Homepage

Below is a short description of the current modules and how they help the authors to check their quality before sending the document for publication:

Forbidden Words Assistant

The Forbidden Words Assistant (FWA) checks for forbidden words in the data sheet, including common misspellings and misspelled trademarked terms.
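The core of such a check is simple term matching against a maintained list. The sketch below is a Python stand-in (the real suite was written mostly in Perl), and the term list shown is purely hypothetical; the actual FWA draws its terms from the department's libraries.

```python
import re

# Hypothetical term list; the real FWA reads its terms from maintained libraries.
FORBIDDEN = {
    "utilize": "use",               # forbidden word with a preferred alternative
    "frame maker": "FrameMaker",    # common misspelling of a trademarked name
    "datasheet": "data sheet",      # house-style spelling
}

def check_forbidden(text):
    """Return (line_no, found, suggestion) for every hit in the text."""
    hits = []
    for line_no, line in enumerate(text.splitlines(), start=1):
        for bad, good in FORBIDDEN.items():
            if re.search(r"\b" + re.escape(bad) + r"\b", line, re.IGNORECASE):
                hits.append((line_no, bad, good))
    return hits

report = check_forbidden("We utilize the datasheet\nas described in Frame Maker.")
for line_no, bad, good in report:
    print(f"line {line_no}: '{bad}' -> use '{good}'")
```

Because the report points at the exact line and suggests the preferred term, an author can fix each hit without consulting the full Publication Standard.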

Legal Checking Assistant

The Legal Checking Assistant (LCA) checks for legal issues in the data sheet, checking for correct text-insets dealing with the disclaimers, definitions, trademark statements, license statements, warnings, and cautions.

List Generator Assistant

The List Generator Assistant (LGA) creates a list of the tables and figures found in the data sheet. This module allows easy checking of title consistency and a check on accidental duplication of a table/figure title.

List Drawings Assistant

The List Drawings Assistant (LDA) creates a list of the graphics found in the data sheet. It shows if the drawings have been imported correctly by reference.

Pinning Diagram Assistant

The Pinning Diagram Assistant (PDA) creates a sorted list of the pins in a data sheet. The results can be used to easily check against the pinning diagram.

Symbol Scanner Assistant

The Symbol Scanner Assistant (SSA) checks all of the symbols it can find in a data sheet and compares them to the contents of the selected libraries.

Parameter Checking Assistant

The Parameter Checking Assistant (PCA) checks all of the parameters for symbols which it finds in the tables in a data sheet and compares them to the contents of the selected libraries. It reports any differences it finds. Note this tool only works on the data in tables which it can identify as probably containing both symbols and parameter descriptions. This tool requires SSA to be selected first and at least one library to be selected.

Variable, XRefs and Dictionary Assistant

The Variable, XRefs and Dictionary Assistant (VDA) lists all of the Variables, XRefs, and Dictionary items found in the data sheet. It shows which are correct and which should be fixed.

And there are three more related to the AQL Assistant Suite but accessed outside the tool suite:

The Graphics Checking Assistant (Figure 2)

The Graphics Checking Assistant (GCA) takes an SVG version of an EPS file created in Adobe Illustrator™ and checks the font, font size, color, and line width of the text, and the width, color, pattern, and placement of the lines in the drawing. It produces two reports: the first shows any deviations from the standard on technical points, such as line width or font size; the second shows the placement of the lines and text in the drawing. Together they give, in seconds, a total picture of the quality of the drawing (according to the Graphics Standard).


Figure 2: Graphics Checking Assistant
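Because SVG is XML, this kind of check reduces to walking the element tree and comparing presentation attributes against allowed values. The following Python sketch illustrates the idea; the allowed widths and fonts are assumptions for illustration, not NXP's actual Graphics Standard.

```python
import xml.etree.ElementTree as ET

# Assumed standard values for illustration; the real Graphics Standard defines these.
ALLOWED_STROKE_WIDTHS = {"0.25", "0.5", "1"}   # in points (assumed)
ALLOWED_FONTS = {"Arial"}                       # assumed house font

def check_svg(svg_text):
    """List deviations from the (assumed) standard found in an SVG drawing."""
    problems = []
    root = ET.fromstring(svg_text)
    for elem in root.iter():
        tag = elem.tag.split("}")[-1]           # strip any XML namespace
        width = elem.get("stroke-width")
        if width is not None and width not in ALLOWED_STROKE_WIDTHS:
            problems.append(f"<{tag}>: stroke-width {width} not in standard")
        font = elem.get("font-family")
        if font is not None and font not in ALLOWED_FONTS:
            problems.append(f"<{tag}>: font '{font}' not in standard")
    return problems

svg = """<svg xmlns="http://www.w3.org/2000/svg">
  <line x1="0" y1="0" x2="10" y2="0" stroke-width="0.7"/>
  <text font-family="Comic Sans MS">Vcc</text>
</svg>"""
for p in check_svg(svg):
    print(p)
```

A production checker would also resolve styles set via `style` attributes or CSS classes, but the attribute walk above is the essence of a seconds-long automated review that replaces a 30-minute manual one.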

Automated Table Generator

The Automated Table Generator (ATG) takes either an XML or CSV file and produces a FrameMaker Table in MIF. There are a number of table formats (extracted from our table library) which can be selected for the look and feel of the table that will appear in the MIF file. There is also a ‘generic’ table option that creates a generic table of 1 to 9 columns.
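The transformation itself is a matter of wrapping each CSV row and cell in MIF's nested `Tbl`/`Row`/`Cell` statements. The Python sketch below is deliberately minimal: real MIF output needs a table catalog entry, column widths, heading rows, and more, so treat this as an illustration of the nesting only.

```python
import csv, io

def csv_to_mif_table(csv_text, tbl_tag="Format A"):
    """Emit a much-simplified MIF table from CSV text.

    Real MIF needs a full catalog, column widths, heading rows, etc.;
    this sketch only shows the basic Tbl/Row/Cell nesting.
    """
    rows = list(csv.reader(io.StringIO(csv_text)))
    out = ["<MIFFile 7.00>", "<Tbls", " <Tbl", "  <TblID 1>",
           f"  <TblFormat <TblTag `{tbl_tag}'>>",
           f"  <TblNumColumns {len(rows[0])}>",
           "  <TblBody"]
    for row in rows:
        out.append("   <Row")
        for cell in row:
            out.append("    <Cell <CellContent <Para <PgfTag `CellBody'>"
                       f" <ParaLine <String `{cell}'>>>>>")
        out.append("   > # end of Row")
    out += ["  > # end of TblBody", " > # end of Tbl", "> # end of Tbls"]
    return "\n".join(out)

print(csv_to_mif_table("Symbol,Parameter\nVCC,supply voltage"))
```

Opening the resulting MIF file in FrameMaker yields a ready-formatted table, which is exactly the re-keying work the ATG eliminates.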

Code Listing Assistant

The Code Listing Assistant creates a MIF file from an ASCII listing of Computer Source Code (or any other text file) with the correct structure for our templates.

Each tool in the AQL Assistant Suite can be selected by the user via a tick box on the input screen (multiple tool selection is also possible). The FrameMaker file is saved in MIF format and uploaded to the server. The other tools can be accessed via their own homepage on our Intranet web site.

The selected tests are run, and a single results index page is generated with links to the individual result pages of each test (the processing time is measured in seconds).

During the life cycle of the document creation process, the different aspects of the document can be checked as they are assembled into the final document. Only the required check, at the required time, needs to be run.

For instance, all of our documentation contains legal information, such as disclaimers, warnings, and copyright information, but these are usually one of the last things inserted into the document. Thus, in the earlier phases of document creation, it is not necessary to run the Legal Checking Assistant tool.

If, however, the author is creating a table of symbols, descriptions, and values, a check using the Parameter Checking Assistant can be run to see if the table contains only approved symbols and descriptions. This check can be done immediately after the table is created, before proceeding with writing the other text in the document.

It is not mandatory to check the document during creation, but we know that almost everyone now does so. As the data is used throughout the document, getting it right in the early stage saves a lot of re-work and cycle-time.

We maintain a library of all valid symbols in all of our technical documentation. I created special software called the “Library Management Support” (LMS), which is used by our Quality Manager to input and maintain the library contents and by all authors as an online reference. The libraries from the LMS are read into the tools and compared to those in the document. An online report is generated showing the incorrect and the correct terms.
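Conceptually, the comparison is a set lookup of every symbol found in the document against the library. This Python sketch (with made-up library entries for illustration) shows how the "correct versus incorrect" split in the online report can be produced:

```python
# Hypothetical library contents; the real tools read these from the LMS.
LIBRARY = {
    "VCC": "supply voltage",
    "IDD": "supply current",
    "Tamb": "ambient temperature",
}

def check_symbols(doc_symbols):
    """Split document symbols into known/unknown against the library."""
    known = [s for s in doc_symbols if s in LIBRARY]
    unknown = [s for s in doc_symbols if s not in LIBRARY]
    return known, unknown

known, unknown = check_symbols(["VCC", "Vcc", "Tamb"])
print("correct:", known)      # symbols found in the library
print("incorrect:", unknown)  # e.g. 'Vcc' differs in case from library 'VCC'
```

Note that an exact (case-sensitive) match is what makes the check authoritative: a symbol that differs only in capitalization is flagged rather than silently accepted.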

Positive Results

One of the most remarkable results of introducing this tool suite was a significant reduction in the number of arguments with our authors. In the old days, they used to get on the phone to us and try to defend the reasons their version of a symbol was correct and ours was incorrect. Since the tool has been online, complaints, arguments, and never-ending discussions have almost completely stopped. People trust the tools.

We still maintain that the tools are not AQL checking tools but only “assistants” to help the authors submit documents of higher quality, with a shorter cycle-time, and more consistency with all of our other documentation.

The editors we have in our head office now work on refining the procedures, checking the documents against the AQL checklist, and advising/training authors on how to create better documentation.

One of the fascinating results of our authors’ leap of faith was the reduction in throughput times from submission to publication. At one time (the tools have been running for more than four years now), it would take about 100 days to create a document. The average creation time has been reduced to 22 days.

We carefully monitor the quality of our publications. There were many justifiable fears that our quality would drop when we implemented the AQL list and the new process model. These fears have largely proven to be unfounded. In fact, the quality of a number of documents is now considerably higher.

Therefore, by putting the tools in place to take the boring, Repetitive Strain Injury (RSI) inducing work out of the checking, we have better user acceptance, a lower throughput time, and higher quality. It is a department manager’s dream come true.

Author Acceptance

The authors creating the documents are extremely happy with the tools. We receive high praise and feedback from them. They can instantly see errors in their documents and correct them before sending them in for publication. Because the tools are available 24/7, our authors in the Far East and on the west coast don’t have time-zone delays added to their throughput times.

The instant feedback, especially when everything is correct, is a source of pride for many of our authors. We often receive thanks from them after releasing a new tool. Their feedback and requests for new tools are accepted, evaluated, and used to create a specification for a new tool. The whole tool development process is user-driven.

To further reduce the throughput and cycle time, we have expanded the authority to perform an AQL check and approval for publication to authors outside our department after they have completed our AQL training course. We have a “continuous checking cycle” which is a random audit (and re-AQL Check) of documents, to constantly watch that the quality is not slipping and identify training opportunities where necessary.

We have a review board that evaluates the current AQL checklist, the Standard, and the quality of our documents; these evaluations are closely tied to the creation of additional tools for the AQL Assistant Suite. For instance, we have a separate Graphics Standard for our technical drawings, and we have just introduced a checking tool that reports deviations from that Standard.

Graphics Checking Tool

Our procedure for checking that graphics adhere to our Graphics Standard required that each bit of text and each line be checked for the correct width, font size, and color. To do this work manually for each graphic takes approximately 30 minutes. Our “Graphics Checking Assistant” does it in seconds.

The users, based anywhere within the NXP intranet domain, can upload their drawings from Adobe Illustrator and view the reports that show the errors that must be corrected before publication.

Easy Conversion Tools

The second area our online tools address is conversion from one media format to another, saving immense amounts of re-keying work.

One of the major headaches with FrameMaker is dealing with tables, and our documents are chock full of tables. We created a table-generator tool called Automated Table Generator (ATG) shown in Figure 3.


Figure 3: Automated Table Generator

The data to fill the tables comes from other tools (proprietary or commercial) or is delivered in an industry-accepted XML format (such as SPIRIT or DITA). It therefore arrives in a variety of formats: ASCII, CSV, and XML, as well as a few proprietary ones.

ATG uses table templates. These table templates are MIF files created from our released templates, with a single table inserted in the structure. This mechanism allows us to change/update the template easily. We only have to save our Table_Library as MIF and run a script that extracts the tables and creates the templates.

Management Resistance

Initially, we had quite a lot of management resistance to changing the publications model to incorporate the toolset. After some user trials and feedback from the authors, management has turned around and now wholly supports the developments. They even insist that all documents must run through the tools during the AQL-Checking phase immediately before publication.

Their turn-around had to do with the management reporting functions, which made them aware of the quality level of the documents. The results could be tracked from an author’s first few uses of the tools, when the number of errors was quite high, to two or three months down the line, when there was, in every case, a significant drop in the number of errors for the same author. The reports allowed them to track the progress of any author and provide extra training targeted specifically to the needs of that user.

Library Management Support (Figure 4)

For the Library management, we started with a simple CSV text format that was edited in a text editor. However, as our libraries grew from 1,200 terms to 18,000 terms, it was apparent that we needed some other kind of management system. The LMS software I designed enables our Quality Manager to view the status of all outstanding items, track their progress through the system, and view the log files in a graphical format as shown in Figure 4.


Figure 4: Library Management Support homepage

We developed an approval process to ensure the quality of each and every symbol. It is a standard process model with customer input, first review plus suggestions, customer approval, second pair of eyes, and then qualification status. Most of our terms are now completely through the process in less than a week, and many are finished in one or two days. They are immediately available for use in the checking tools. Before this process and tool, the throughput time for a symbol/parameter was 30 to 90 days (Figure 5).


Figure 5: LMS edit parameter screen

We implemented the check with simple status flags, which the two-person library support staff change as they complete each step of the process. Because a log file records every action, it is easy to trace any term through the entire process from conception to release for publication.
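A status-flag workflow with a full audit log can be sketched in a few lines. The step names below follow the process described above; everything else (class shape, log format) is illustrative Python rather than the actual LMS implementation.

```python
from datetime import datetime, timezone

# Status flags taken from the process steps described in the article.
STEPS = ["customer input", "first review", "customer approval",
         "second pair of eyes", "qualified"]

class Term:
    def __init__(self, symbol):
        self.symbol = symbol
        self.step = 0
        self.log = []            # every action is recorded for traceability
        self._record("created")

    def _record(self, action):
        stamp = datetime.now(timezone.utc).isoformat()
        self.log.append((stamp, self.symbol, action))

    def advance(self):
        """Move the term to the next status flag and log the transition."""
        if self.step < len(STEPS) - 1:
            self.step += 1
            self._record(f"status -> {STEPS[self.step]}")

    @property
    def status(self):
        return STEPS[self.step]

term = Term("VCC")
for _ in range(4):
    term.advance()
print(term.status)    # final status after all steps
print(len(term.log))  # creation plus four transitions
```

Because every transition appends a timestamped entry, tracing a term "from conception to release" is just a matter of reading its log back.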

Try It… You’ll Like It…

Incorporating web-based tools whose early checking is encouraged and embraced by the authors paid dividends that astounded us. The overall consistency of our documentation since the implementation of the tools has been greater than 78 percent. We measure consistency through a process of post-AQL checking: documents are selected at random and subjected to an intense, thorough examination, with the faults logged and graphed statistically against a perfect score. It is a complex process, and we use it to monitor our quality.

Also, we have other software I developed for monitoring the quality of our libraries by cross checking and comparison. This software outputs a list of all inconsistencies and calculates a ‘Quality Factor’ for the libraries overall as well as individually. This data is then acted upon by our Quality Manager. With the benchmark figures, it is easy to see if something we implement has the desired effect on our quality. The quality of our documentation, in a recent independent survey, is in the top 10 percent of our industry.
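The article does not publish the Quality Factor formula, so the sketch below is only one plausible interpretation: cross-check two libraries and score the fraction of shared symbols whose descriptions agree.

```python
# Hypothetical cross-check between two libraries; the actual 'Quality Factor'
# formula is not published, so this definition is an assumption.
def quality_factor(lib_a, lib_b):
    """Fraction of shared symbols whose descriptions agree between libraries."""
    shared = set(lib_a) & set(lib_b)
    if not shared:
        return 1.0  # nothing to disagree about
    agree = sum(1 for s in shared if lib_a[s] == lib_b[s])
    return agree / len(shared)

qf = quality_factor({"VCC": "supply voltage", "IDD": "supply current"},
                    {"VCC": "supply voltage", "IDD": "drain current"})
print(f"{qf:.2f}")  # one of two shared descriptions agrees
```

Whatever the exact metric, the value of a single number is the benchmarking the author describes: run the same calculation before and after a change and see whether quality moved.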

Providing the tools early in the document-creation process provided its greatest payback in our time-to-market by reducing the throughput times and cycle-times to a minimum without sacrificing quality in the process.

Reducing Training Requirements

Another effect of the AQL Assistant Suite is that our training program for new authors is much simpler and takes a lot less time. We have a section on interpreting the results of the tools and how to fix the problems reported. (This information cut three days from our training course).

The Development Process

We develop following a Rapid Application Development (RAD) model using an Extreme Programming (XP) methodology. RAD allows fast prototyping and easy implementation with a minimum of interaction between the modules.

I initiate, specify, and develop most of the tools. When the basic application is working, we hire an external professional programmer to review and re-write the code to ensure that it is maintainable. This method is optimal because our department has no budget for software development. Almost all of our software is written in Perl, JavaScript, SVG, XML, MIF, and HTML.


Building the quality in, instead of bolting it on, provides the kind of service level we are proud of. The introduction of simple and specific tools to perform difficult, boring, or repetitive tasks made user acceptance much easier. We met almost no resistance from users. In contrast, we get requests to investigate new tools to solve problems that we didn’t even know existed, allowing the tool suite to grow and evolve.

The tools now process about 300 to 500 documents per month, with an average length of 32 pages, though some are over 200 pages. In the graphics area, we process over 700 drawings per month. The ROI for the tools was met in one month, with a saving of, on average, 1,400 minutes per month for the technical-drawing office alone.

The Future

We are already looking into other homemade tools to create entire documents from database XML data. We continue to evaluate, build, and deploy new tools to solve specific problems. Our tools enable our authors to concentrate on the technical content they write and less on the technical points of our Publication Standard.

The developments in XML, and particularly DITA, prompted us to start looking at generating XML blocks from our existing FrameMaker data. The initial results from my programming efforts are very encouraging. I believe that when the day comes that we switch to DITA/XML, we can preserve our legacy data with simple tools to extract the content from our documents and automatically create DITA/XML topics. We hope for something like: upload a FrameMaker document and download 50 to 60 DITA-compliant blocks—in seconds.
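The shape of such a converter is easy to picture: split the extracted content at section boundaries and wrap each section in a minimal DITA topic. The Python sketch below assumes the content has already been pulled out of FrameMaker as (title, body) pairs; a real converter would also map paragraph tags to DITA elements.

```python
import re
from xml.sax.saxutils import escape

def to_dita_topics(sections):
    """Turn (title, body) pairs into minimal DITA topic files.

    A real converter would map FrameMaker paragraph tags to DITA
    elements; this only shows the shape of the output.
    """
    topics = {}
    for title, body in sections:
        topic_id = re.sub(r"\W+", "_", title.lower()).strip("_")
        topics[topic_id + ".dita"] = (
            f'<topic id="{topic_id}">\n'
            f"  <title>{escape(title)}</title>\n"
            f"  <body><p>{escape(body)}</p></body>\n"
            f"</topic>\n"
        )
    return topics

# Hypothetical sections, as if extracted from a data sheet.
files = to_dita_topics([("Pinning information", "Pin 1 is VCC."),
                        ("Limiting values", "Tamb = -40 to +85 C.")])
print(sorted(files))
```

Each data-sheet section becomes a self-contained, reusable topic file, which is exactly the "50 to 60 blocks per document" outcome described above.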

David A. Reid

NXP Semiconductors BV

David’s background is in digital electronics design and product creation. His own consultancy company (DRCP) has been running for more than 18 years. He is British but was educated in America (Houston, Texas) and now resides in a countryside village near Eindhoven (Netherlands) with his wife and child. David was a technical author for many years before stepping over into a documentation support role when he joined Philips Semiconductors (now NXP Semiconductors) in 2000. Since that time, he has been involved in running projects to aid the work of technical authors and those connected to the documentation process.