CIDM

April 2009


Putting the “M” in CMS


Harvey Greenberg, XyEnterprise

When DITA met content management for the first time, the initial challenge for the tools vendors was to support the specification, starting with map support and integration with the Open Toolkit. Vendors needed to support linking, specialization, bookmaps, and more, and support efforts will continue as the specification evolves. While the importance of supporting the spec is obvious, what sometimes gets lost in the discussion is the value proposition of content management technology, especially given successful implementations of DITA using the file system and source code control. This article discusses just three areas in which content management can add value: reporting, project management, and quality assurance. Note that some of the case studies presented involve neither DITA nor even XML, but the lessons learned are universal.

Reporting

Reporting is included with most, if not all, CMS products, and it almost always appears in CMS requirements statements, demos, and proposals. The requirements tend to gravitate toward one of two ends of a spectrum: standard reporting capabilities that may or may not provide value in a production environment, or laundry lists of desired items with little regard to where the information will come from. As with many aspects of CMS projects, the really interesting discussions begin when the customer and vendor form a team and start sharing ideas. And when the customer goes on to integrate content management with their business process, the results can be remarkable.

An example of such results can be seen in the XML process in place at an international supplier of plumbing products. During the initial implementation, which used a simplified DocBook content model, the main focus was on reuse and reliable output of XML to PDF. A few custom reports were specified, but if they were used at all in production, they certainly did not get much attention. Over time, as the new editorial workflow and publishing process settled in, the system administrator, who happened to have a SQL background, became interested in how more information could be captured by and extracted from the system. Using additional metadata and queries, he produced a family of web-based reports, available on the company’s intranet, that span everything from the micro (project details) to the macro (performance metrics) and are essential in running the publications side of the business.

The screen capture (Figure 1) illustrates what happens at the intersection of technology and imagination. The so-called “Drift Date” report is triggered when projects are placed on hold or when dates shift. It brings together not only the project at hand but also related projects, publications, and cost centers that are tied together by metadata. Suffice it to say, this report, which was never envisioned during the RFP stage, arms managers with information that would have been difficult to obtain without the CMS, allowing them to spot problems early and resolve them proactively.

Figure 1: Example of a custom report
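
For readers curious about the mechanics, the underlying idea is simple: join project status and date metadata to the publications and cost centers that share it. Below is a minimal sketch in Python using an in-memory SQLite database; the table and column names are invented for illustration and do not reflect the actual schema of Contenta or any other product.

import sqlite3

# Hypothetical schema for illustration only: projects carry status and
# date metadata; publications are tied to projects by a foreign key.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE projects (
    id INTEGER PRIMARY KEY,
    name TEXT,
    cost_center TEXT,
    status TEXT,           -- e.g. 'active' or 'hold'
    planned_date TEXT,     -- ISO dates stored as text for simplicity
    revised_date TEXT
);
CREATE TABLE publications (
    id INTEGER PRIMARY KEY,
    project_id INTEGER REFERENCES projects(id),
    title TEXT
);
""")

conn.executemany("INSERT INTO projects VALUES (?, ?, ?, ?, ?, ?)", [
    (1, "Faucet manual rev C", "CC-100", "hold", "2009-03-01", "2009-04-15"),
    (2, "Shower valve guide", "CC-200", "active", "2009-03-10", "2009-03-10"),
])
conn.executemany("INSERT INTO publications VALUES (?, ?, ?)", [
    (10, 1, "Installation Guide"),
    (11, 1, "Parts List"),
    (12, 2, "Owner's Manual"),
])

# The "drift date" idea: flag projects that are on hold or whose dates
# have shifted, pulling in the publications and cost centers tied to
# them by metadata.
rows = conn.execute("""
    SELECT p.name, p.status, p.planned_date, p.revised_date,
           p.cost_center, pub.title
    FROM projects p
    JOIN publications pub ON pub.project_id = p.id
    WHERE p.status = 'hold' OR p.revised_date <> p.planned_date
    ORDER BY p.cost_center, p.name
""").fetchall()

for row in rows:
    print(row)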

Quality

The ability of a CMS to provide history, version control, and an audit trail promotes quality. Of course, those same features are found in source-code control systems, so perhaps we should look beyond the basics for additional models of quality. One case study that illustrates some best practices is a Microsoft® Word implementation at a UK-based legal and financial publisher. This customer acquired XyEnterprise’s Contenta CMS so that they could author in Word and dynamically convert to XML using a Word-to-XML product, with the goal of reducing typesetting costs. Believe it or not, the movie ended well, but not without some speed bumps, not the least of which appeared when we started to think about how to map the Word content to the proposed DTD.

After the team briefly flirted with canceling the project, an internal XML consultant from another division of the company was brought in. To everyone’s surprise, he proposed using the CMS to adapt the workflow around the current Word process. His suggestion was to devote near-term resources to business process improvement; when XML was ready to happen, it would. The team had been working in Word for a long time, and to their credit, they had a fairly well-developed set of templates and automation (macros and other Visual Basic® tools) intended to keep their content clean and standardized. What they needed was additional visibility and leverage.

The first step was to organize the data, a step as simple as configuring a top-level “Title” object (title, in their parlance, meaning publication) with metadata and importing Word components from directories and subdirectories as they existed on the file system. The result was a collection of titles consisting of parts and Word objects, each representing a section or perhaps just a page range within a section (the break point could be arbitrary). Then, tools were built in Visual Basic that integrated the check-out/check-in process (using the CMS API) with the existing Word automation (using the Word API). For example, at a point in the workflow when it was necessary to strip out unwanted formatting, authors launched a tool that checked out the object, ran the macro, and, upon check-in, updated metadata indicating that the process had been run. With metadata, workflow, and custom reporting, editors could see at any time which documents had been updated in the workflow and what their states were. They could also selectively send files downstream to the typesetter or post them on the Internet.
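
The glue code for a tool like that is short. A minimal sketch follows, in Python rather than the Visual Basic the team actually used, with the pywin32 package driving Word’s COM object model. The two CMS functions and the macro name are hypothetical stand-ins; a real implementation would call the vendor’s actual API.

import win32com.client  # pywin32; Windows only

def cms_check_out(object_id: str) -> str:
    """Hypothetical stand-in for the CMS call that locks the object
    and returns a local working path."""
    raise NotImplementedError("replace with the vendor's check-out call")

def cms_check_in(object_id: str, path: str, metadata: dict) -> None:
    """Hypothetical stand-in for the CMS call that stores the file
    and updates its metadata."""
    raise NotImplementedError("replace with the vendor's check-in call")

def strip_formatting(object_id: str) -> None:
    # 1. Check out: lock the object and get a local copy.
    path = cms_check_out(object_id)

    # 2. Run the existing Word cleanup macro against the local copy.
    word = win32com.client.Dispatch("Word.Application")
    try:
        doc = word.Documents.Open(path)
        word.Run("StripUnwantedFormatting")  # macro name assumed
        doc.Save()
        doc.Close()
    finally:
        word.Quit()

    # 3. Check in, recording that the cleanup step has been run so the
    #    reporting layer can see it.
    cms_check_in(object_id, path, {"formatting_stripped": "yes"})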

While this particular case study might not appear at first glance to be a quality story, it did in fact result in a repeatable business process and reduced variance. Eventually, the bridge from Word to XML was built, but it was built downstream of authoring, so none of the editors had to worry about it. Since there were over 100 editors, the cost avoidance in XML training and application software was considerable. In a similar vein, one might envision integrating terminology checking with a CMS workflow and reporting. The technology to do so is straightforward; the thought process about what to capture and what to measure is the hard part.
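
As a thought experiment, the checking itself could be as simple as the sketch below: scan topic text against a list of deprecated terms and hand the findings back to the workflow as metadata. The term list and the metadata convention are invented for illustration.

import re

# Invented term list: deprecated term -> preferred term.
TERMINOLOGY = {
    "log on to": "log in to",
    "e-mail": "email",
}

def check_terminology(text: str) -> list[tuple[str, str, int]]:
    """Return (deprecated term, preferred term, hit count) for each match."""
    findings = []
    for bad, preferred in TERMINOLOGY.items():
        hits = len(re.findall(re.escape(bad), text, flags=re.IGNORECASE))
        if hits:
            findings.append((bad, preferred, hits))
    return findings

# A workflow step might run this on check-in and store the result as metadata.
for bad, preferred, count in check_terminology(
        "Log on to the portal and e-mail the draft."):
    print(f'{count} x "{bad}" -> use "{preferred}"')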

Project Management

DITA has made life easier in many important ways. As the minimum revisable unit, a topic is a topic. Topics are independent of output. The DITA spec provides linking, personalization, and other mechanisms that people can use with confidence. On the other hand, DITA probably increases the size of the problem being managed due to the sheer number of topics. In my Air Force days, I would have referred to this as “increased ops tempo.” Whereas in the DocBook generation we would typically send a writer a book with sections to update (with possible reuse implications), DITA sends many more topics in a multitude of directions, with the added complexity that they are handled by virtual teams.

Figure 2: Check-out from CMS (left) and launch of macro (right)

S1000D, a topic-based architecture used widely in military and aerospace systems, adds the additional construct of project completeness. Completeness is defined by creating a Data Module Requirements List (DMRL) during the planning phase. During execution, the data modules that are complete can be compared to the data modules required, and when the two match, the task is finished. A challenge in S1000D, therefore, is creating interfaces and tools to manage the DMRL, which is another area where a CMS adds value. While DITA does not require a DMRL or any other metric for project completion, perhaps the user community should consider whether exploring and recommending some sort of best practice would be useful.
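
Once the DMRL exists, the completeness arithmetic is trivial, as this sketch shows (the data module codes are placeholders, not real DMCs):

# Compare data modules required (from the DMRL) to data modules completed.
required = {"DMC-A", "DMC-B", "DMC-C", "DMC-D"}  # planned during the DMRL phase
completed = {"DMC-A", "DMC-C"}                   # signed off so far

outstanding = required - completed
unplanned = completed - required  # work done outside the plan

pct = 100 * len(required & completed) / len(required)
print(f"{pct:.0f}% complete; outstanding: {sorted(outstanding)}")
if unplanned:
    print(f"warning: modules not in the DMRL: {sorted(unplanned)}")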

Figure 3 shows a prototype interface that places a number of functions in the hands of a project manager: querying for topics, aggregating topics into a work package, creating topics, providing instructions, creating and routing the work package, and more. While the interface may or may not reflect how users want to use the system, the point is that a CMS is likely to provide a better framework for getting started than a file system or a source-code control system.

Figure 3: Prototype project management interface built on CMS framework
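
To make the idea concrete, here is a sketch of the kind of data model such an interface manipulates: query the repository for topics matching some metadata, bundle them with instructions, and route the package. The Topic and WorkPackage structures are invented for illustration; a real CMS supplies its own objects, queries, and routing.

from dataclasses import dataclass, field

@dataclass
class Topic:
    id: str
    title: str
    product: str
    status: str  # e.g. 'released' or 'needs-update'

@dataclass
class WorkPackage:
    name: str
    assignee: str
    instructions: str
    topics: list = field(default_factory=list)

repository = [
    Topic("t1", "Install the valve", "valve-200", "needs-update"),
    Topic("t2", "Service the valve", "valve-200", "released"),
    Topic("t3", "Install the faucet", "faucet-10", "needs-update"),
]

# Query by metadata: all valve-200 topics flagged for update.
hits = [t for t in repository
        if t.product == "valve-200" and t.status == "needs-update"]

# Bundle the results with instructions and route to a writer.
package = WorkPackage(
    name="valve-200 rev B updates",
    assignee="writer-team-east",
    instructions="Update torque specs per ECO 1234.",
    topics=hits,
)
print(package)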

Summary

My intent in preparing this article is twofold. The first, and more obvious, is to encourage organizations that are not using a CMS to consider implementing one. Again, there is no doubt that, with care, you can get along without one. The issue is whether you can do better with a CMS and whether it provides sufficient ROI. The second purpose, aimed more at people actively evaluating solutions, is to make you aware that there is much more to a CMS than meets the eye. Any demo that you see at a tradeshow or on a vendor call should be only the beginning of a conversation leading to a selection. Oftentimes, the biggest bang for the buck comes from things that were never envisioned in the initial requirements set. It’s best to take the long view!

About the Author

Harvey Greenberg
XyEnterprise
harvey.greenberg@xyenterprise.com

Harvey R. Greenberg is XML Evangelist for XyEnterprise, where his role spans implementation, product development, product marketing, sales, and education. He joined XyEnterprise in 2000 as part of the Contenta CMS launch team and has since done over a dozen implementations using XML, FrameMaker, Word, and other technologies in both Government/Defense and private-sector organizations.

Mr. Greenberg came to XyEnterprise from Standard & Poor’s DRI, where he was manager of the award-winning U.S. Industry and Trade Outlook project with McGraw-Hill Professional Publishing and the U.S. Department of Commerce.