Definition of Done for Documentation Teams
Kathy Madison, Comtech Services
There are a few definitions of done: past participle of do; completed; cooked sufficiently; worn out; doomed to failure, defeat, or death; and even used up. The October CIDM managers roundtable discussion focused on the challenges and best practices associated with the definition of done as it relates to documentation and how it affects team productivity and content quality. Only a small group of managers participated in the session, but they represented documentation teams who were developing content in both agile and non-agile environments.
Regardless of the environment, members use a list of criteria that must be met before they consider their content to be done. This checklist varies slightly from company to company, with some focused on technical accuracy while others look at whether the content provides the best possible user experience.
One company said their content goes through a technical review by their scrum teams or, in some cases, a wider audience, but it does not go through an editorial review. This company is not able to scrub the content as much as they’d like; ideally, they would like a review for clarity, consistency, and relevance. To address part of this issue, they plan to use Acrolinx to check for adherence to a style guide, which will help from an editorial perspective.
Another company, which is in growth mode, uses product managers, writers, and editors in its review process to ensure the content is done. The product managers make sure the new features are documented appropriately. Each writer is expected to review their own content against a checklist prior to meeting with peers for a Documentation QA Day, where they review each other’s content. In some cases, especially with newer writers, the content is also reviewed by an editor.
The companies on the call provided several examples of their “done” checklist items:
- Does the content adhere to the style guide?
- Are there typos or spelling errors?
- Are the links valid?
- Is the appropriate metadata assigned?
- Is the content right for the audience?
- How translatable is the content?
- Does the content adhere to the content strategy and is there “one-voice”?
Who has the final say as to when the documentation is done? Often it is the product or project manager, but in one company, which works in an agile environment, several people have to sign off before the document is considered done—the main engineers assigned to the features, the QA person assigned to validate the information, and a peer writer. In another company, which is partly agile, there is a bit of negotiation among the writers, engineers, and project managers to determine what’s considered done. For example, if the engineers don’t approve the content from a technical point of view, or if the writers feel the user experience won’t be good, the content most likely won’t be released.
Is all content treated the same when it comes to being done? Not for all the CIDM members on the call. One said that for technical accuracy, all topics are treated the same, but for style guide adherence, the standard for internal-facing documentation isn’t as high as it is for customer-facing documentation. Another said that even though they author in XML, they review the PDF output more closely than their HTML output. For an agile team, text in the user interface is reviewed by the whole scrum team, but content on the help page may only be looked at by the documentation team. Someone also mentioned that their release notes go through a different review process than their administration guides.
How does the concept of minimal viable content (MVC) fit in with the definition of done? Most on the call are still trying to figure this out. Several companies do their best to work with the product managers to determine the most important things customers will need and what should be emphasized in the documentation. Often the content that falls outside the minimum gets planned for a minor release or never gets addressed, because the team has already moved on to the next feature or release. One member’s goal is to figure out the optimal amount of content that allows their customers to be successful with their products, and how writers and engineers can align their efforts to ensure customer success.
We wrapped up the call by discussing whether there really is a difference in the definition of done for agile versus waterfall environments. We agreed that the definition of done doesn’t differ; what differs is the process you take to get there. If you are practicing agile, getting scrum teams to include documentation in user stories can be a challenge. When documentation is not included in the definition of done, the documentation is often released after the product, which stresses the writers and makes the customers suffer. It’s interesting to note that in our 2016 CIDM Agile Survey, of those members developing in an agile environment, fifty-seven percent said documentation tasks must be completed before the story is done, but eighteen percent said documentation tasks don’t need to be completed for the story to be done, and twenty-five percent said their documentation tasks are not even included in their scrum team’s user stories.
One member summed up the meeting perfectly: “Agile has helped make the documentation be built into the process and not considered an afterthought at the end of the cycle. When you have scrum masters and product owners who make documentation a part of the process, there is much more focus on the documentation. It’s great for us!—we are now an integral part of the team.”