Improving CMS Archiving Process and Efficiency—Lean Sigma at Hewlett-Packard
In 2007, HP endorsed an organizational improvement methodology, one that relies on a common language for process improvement efforts. HP uses the term “Lean Sigma” to characterize the HP Green Belt and Black Belt programs:
- Lean refers to eliminating waste or non-value-add components of a product, service, or procedure
- Six Sigma entails a set of tools and methods for executing breakthrough improvement projects
This article provides an overview of how to carry out a Green Belt project and shows other industry professionals how HP supports continuous improvement to ensure high customer satisfaction.
The Green Belt process improvement project for improving CMS archiving process and efficiency employed the HP Lean Sigma methodology. Project phases included:
- Testing assumptions
- Collecting baseline data
- Redesigning the process
- Meeting or exceeding expectations
- Supporting the redesigned process
As the CMS Administrator, I saw much room for improvement in CMS archiving. The CMS versioning feature was not as fully functional as it needed to be to best support the department’s archiving process. However, because I was responsible for submitting enhancement requests to our CMS application vendor, I knew that several improvements had been made that would benefit my organization. While I had not yet tested those improvements, I believed that they would positively impact the cycle time for archiving customer content, and therefore predicted that our writers’ satisfaction with the process would improve. I decided that the best way to proceed would be to employ Lean Sigma methodologies for process improvement, focusing on improving the CMS archiving process for better efficiency.
A component-based content management system, illustrated in Figure 1, was the key technology assessed in this process improvement project.
Of the four phases of our content management process—authoring, managing, publishing, and localizing—the management phase was the primary focus of this study.
What I was trying to accomplish
The improvement objectives targeted in this project in the management phase included
- fewer steps for writers
- reduced cycle time
- increased customer satisfaction
- less administrative overhead
- use of the feature called “versioning” instead of “cloning”
- compatibility with content managed as components or modules
The use of versioning was an unknown variable at the outset of the project, with several options for implementation, especially in terms of its impact on a modular environment, which previously had not been supported. Even so, we assumed that cost savings would result if cycle times for writers and CMS administration were reduced.
Multiple cycles of the Plan-Do-Study-Act (PDSA) improvement model were needed before the process redesign could begin. For each cycle, I either implemented a change or tested a change based on a plan. Upon summarizing the findings, I determined what additional changes or tests were needed in subsequent cycles.
In total, the project employed 19 PDSA cycles, beginning with testing a number of assumptions at the outset of the project.
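The PDSA model described above can be sketched as a simple loop. The function below is only an illustration of the cycle’s structure; the actual cycles were manual activities, and the stopping condition shown here is hypothetical.

```python
# Minimal sketch of the Plan-Do-Study-Act loop as applied in this project.
# The cycle structure mirrors the article; the stopping condition is illustrative.

def pdsa(plan, do, study, act, max_cycles=19):
    """Run Plan-Do-Study-Act cycles until act() reports no further work is needed."""
    for cycle in range(1, max_cycles + 1):
        change = plan(cycle)      # Plan: decide what to change or test
        result = do(change)       # Do: implement or test the change
        findings = study(result)  # Study: summarize the findings
        if not act(findings):     # Act: decide whether another cycle is needed
            break
    return cycle
```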
PDSA cycle 1: Do I have the right team?
The first step in the project was to determine whether I had the right team. Eight members had previously been selected based on who I thought should make up the team, including managers directly involved in our content management strategy, the CMS administrator, tools support personnel, and writers representing each of our business segments. I plotted their names onto a team map consisting of four quadrants representing authority to make changes and detailed subject matter knowledge. To my surprise, two of the eight team members were not needed for the project.
Figure 2 depicts the team structure with these two individuals in the bottom right quadrant.
As a tools support person myself, I had included our Content Management Editor and our authoring and publishing developer on almost every project I had worked on to that point. I thought that including them in this project would provide a broader perspective on potential impacts to other areas of our content management process. However, the team map indicated that neither of them had detailed subject matter knowledge or the authority to make changes to the CMS archiving process. Since that was the focus of this project, I had to concede that including them would not be a value-add for them or for the project. In contrast, while neither of the managers directly involved in our content management strategy had the subject matter knowledge required for this project, they clearly had authority over the process. Though they depend on people like me to make decisions that lead the department to achieve specific goals, they have the authority to change that direction or to override my decisions. As a result, their involvement in the project was important. The writers representing our business segments did not have the authority to make changes to the process. However, they had the detailed subject matter knowledge required to make the redesign successful.
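The team-map analysis amounts to a two-axis classification. The sketch below models it in code; the names, yes/no scores, and quadrant labels are hypothetical illustrations, not the actual team data.

```python
# Hypothetical model of the team-map quadrants: each candidate is scored on
# two axes (authority to make changes, detailed subject-matter knowledge)
# and placed into one of four quadrants. All names and labels are illustrative.

def quadrant(authority: bool, knowledge: bool) -> str:
    """Map the two yes/no axes to a team-map quadrant."""
    if authority and knowledge:
        return "core team (authority and knowledge)"
    if authority:
        return "sponsor: keep informed (authority only)"
    if knowledge:
        return "subject-matter expert (knowledge only)"
    return "not needed for this project"

candidates = {
    "CM strategy manager": (True, False),
    "CMS administrator":   (True, True),
    "Writer (segment A)":  (False, True),
    "CM editor":           (False, False),  # lands in the "not needed" quadrant
}

for name, (auth, know) in candidates.items():
    print(f"{name}: {quadrant(auth, know)}")
```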
PDSA cycle 2: What mainstay processes are linked to the current archiving process?
I developed a system linkages map to show a high-level view of the “mainstay” processes, as well as the processes driving the management of content we sought to improve: marketing, engineering, and corporate branding requirements. The activities resulting from the drivers included collecting data, creating a documentation plan, producing draft versions, producing final versions, localizing, and publishing. The project’s area of focus was midstream, M3 and M4 as depicted in Figure 3, including producing draft and final versions. As the map indicates, there were many support processes. However, the two that played a major role in the project were tools administration and training.
By analyzing these system linkages, I learned that the activities the project focused on were indeed related to other “mainstay” processes; however, they were not dependent upon them, nor did the M3 or M4 activities impact any of the other processes. This analysis confirmed that redesigning the archiving processes would not impact other areas outside of my control. In addition, while the driver processes feed into the department’s mission to provide end-user documentation for all of our products, they do not directly impact the archiving process. Finally, knowing what support processes would be required for a successful redesign was important to understand from the project’s outset. Fortunately, the two support processes that would have the most impact, S5 and S6, were completely within our control. All-in-all, the system linkages map provided a green light to continue with the project.
PDSA cycle 3: Who/what are the suppliers, inputs, outputs, and customers for this process?
I created a SIPOC (suppliers, inputs, process, outputs, and customers) diagram, as shown in Figure 4, to verify the nature of the customers. This process entailed identifying key inputs and outputs. In this case, the suppliers and the customers are the same. My assumption was that the legal department would be the supplier because we archive content for legal purposes. However, it became clear that it is the writers who submit XML content into the CMS to archive older editions so that new output can be created when needed. My assumption was based on too broad a view of the process. For the redesign to be successful, it was critical that I narrow it down to a lower level of detail, ensuring I fully understood the most critical inputs into the process.
Because archiving customer content and creating new editions are closely related, I created SIPOC diagrams for both processes, shown in Figure 5. This action confirmed that the suppliers and customers were the same for both activities.
Created on the heels of the system linkages map, the SIPOC diagram provided a more magnified view into the specific processes in question, ensuring I fully understood the suppliers, inputs, outputs, and the customers specific to the process I was to redesign.
PDSA cycle 4: What causes the lack of efficiency?
I had a theory as to what caused the lack of efficiency in the archiving process. To test that theory, I used a fishbone diagram, shown in Figure 6, to analyze cause and effect aspects of archiving. I knew the problem (effect). This diagram helped to validate the causes, captured by the “bones.” Several areas emerged as outside my span of control, including policies, tools, and XML documents. However, procedures were something that I could control, so the project focused exclusively on that cause.
I knew who my team was, how the process fit into the bigger picture, who my customers were, and the causes that were within my control. I was now ready to determine measures for process improvement.
How I knew change was an improvement
Measures were based on the project’s improvement objectives:
1. Waste is eliminated, resulting in fewer steps
2. Time to complete the process is reduced
3. Customer satisfaction is increased
4. Content collections are reduced by half
5. Versioning is enabled, and best practices are in place to support it
6. The revised process accommodates modularized content
With these measures in mind, goals had to be established. Some of the goals were apparent without collecting baseline data. For example, I hypothesized that implementing versioning would remove the need for both working and released collections, resulting in collections being reduced by half. I also knew that versioning would need to be configured to support modularized content, and that, in addition to configuration updates, best practices and procedures would need to be developed.
The remaining goals required the collection of baseline measures for the current archiving process, including the number of steps, cycle times, and the customer satisfaction level.
Collecting baseline data
Before specific goals could be set, baseline measures had to be collected for comparison purposes.
PDSA cycle 5: How many steps are in the current process?
Mapping out the current process provided a starting point for several follow-on cycles, including capturing baseline data for the number of steps and providing a high-level view so that waste vs. value-add could be identified.
I created process maps, shown in Figure 7, for both the archiving and creating new editions processes because, as previously discussed, the two processes were very closely related. Viewing them side-by-side allowed me to see the similarities. For example, both processes involve opening CMS collections, though the collections are not the same. Both processes involve dragging and dropping content, or “cloning.” Understanding the similarities helped me to look for efficiency gains in later PDSA cycles. However, at this point in the project, my focus was simply to gather the baseline data.
PDSA cycle 6: How long does the current process take?
Capturing the current cycle time provided baseline data that I later measured against to ensure that the process improvement goal of reduced cycle time was met, as shown in Figure 8.
The cycle times varied quite a bit from user to user. In addition to providing cycle times, the users also provided information about the XML files, such as file size. There was a direct correlation between the size of the file and the time it took the user to archive it so that new editions could be created. As previously shown in the fishbone diagram (Figure 6), file size was outside of the span of control for this project. However, knowing there would be a range of results, I concluded that I should calculate averages for the data collected in the PDSA cycles directly related to redesigning the process later in the project.
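This kind of baseline analysis, averaging per-user cycle times and checking the size-to-time correlation, can be sketched as follows. The per-user numbers are hypothetical; the article reports only averages.

```python
# Illustrative sketch of the baseline analysis: per-user archiving cycle
# times alongside the sizes of the XML files they worked with.
# All data points are hypothetical placeholders.
from statistics import mean

cycle_minutes = [18, 24, 31, 40, 52]   # one sample per writer (illustrative)
file_size_mb  = [2.1, 3.0, 4.4, 5.8, 7.5]

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

print(f"average cycle time: {mean(cycle_minutes):.1f} minutes")
print(f"size/time correlation: {pearson(cycle_minutes, file_size_mb):.2f}")
```

A correlation near 1.0 would confirm what the fishbone diagram suggested: larger files take proportionally longer to archive.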
PDSA cycle 7: What is the writers’ satisfaction level with the current process?
Capturing the current customer satisfaction level (Figure 9) provided baseline data that I later measured against to ensure the process improvement goal of increased satisfaction was met.
I was very surprised that the satisfaction level for the current process was this high. Because of the time involved, I expected users would be less satisfied. However, because archiving customer content only occurs when a project has been completed, the process is not used as often as other CMS processes. As a result, writers were willing to tolerate the cycle times. In contrast, they would have been less willing to tolerate long cycle times for processes they use on a daily basis, such as loading content into the CMS. This result gave me a different perspective, viewing the process from the customer’s eyes rather than from my own.
PDSA cycle 8: Whose feedback should I focus on?
Having collected baseline data, including levels of satisfaction, I felt it was important to be proactive in case I received conflicting feedback from my stakeholders. I created a stakeholder map to determine whose feedback should take priority. The map consisted of four quadrants representing involvement in the project and commitment to change.
Figure 10 depicts the stakeholder structure with those taking priority in the upper right quadrant.
The stakeholder map looks very similar to the team map (Figure 2). However, the quadrants represent different qualities, ranging from high to low. Analysis of the map showed that the managers directly involved in the content management strategy must be kept informed on the progress of the project and any decisions resulting from it. For the CMS Administrator, I had to address any concerns such as CMS configuration. The writers were enlisted to help redesign the process. As a result, their buy-in was key. Their feedback took priority and would ultimately drive the direction of the process redesign.
Once baseline data was collected, more specific improvement goals were established as shown in Table 1.
Redesigning the process
With the goals in mind, we were able to begin the process redesign. In my opinion, this is where the real work began.
PDSA cycle 9: Which parts of the process are considered value-add vs. waste?
Customers were surveyed to learn how they felt the process could be improved, as shown in Figure 11. From that, I learned that writers wanted benefits from the CMS, including content that is accessible, valid, and archived, with new editions easily created. I also learned that they are not willing to invest more time, but rather, expect the process to take less time. Steps not associated with those needs were considered non-value-add, or waste.
For both the archiving and creating new editions processes, non-value-add steps included opening multiple CMS collections and moving or copying content. The creating new editions process included the additional non-value-add step of updating the client key, or ID attribute, of the content, so that a separate edition could be created. In PDSA cycle 5 (Figure 7), I noted similarities between the two processes. Creating the value stream map confirmed my suspicions that potential efficiencies could be gained by eliminating this redundancy. Before I could be sure, I needed to consult the subject-matter experts.
PDSA cycle 10: How can waste be eliminated?
For this cycle, I consulted with subject-matter experts, including the writers, CMS vendor support, and the CMS administrator; the current process is shown in Table 2. Based on those discussions, I decided to implement basic versioning within the CMS. This process resulted in fewer collections to open (WIP and Released) and no cloning (dragging and dropping) required.
In reviewing customer comments, several requirements emerged. These requirements, including faster response time, automation, and no duplication of content, could be achieved by implementing basic versioning in the CMS. Versioning would rely on overwriting content within the same collection. Also, by implementing versioning, users were given the option to label versions with information they felt was relevant. We then tested the success of these changes with additional PDSA cycles.
PDSA cycle 11: Are there fewer steps in the revised process?
Process steps were reduced to a total of four, as outlined in Figure 12. This reduction met our goal (see Table 1).
Implementing versioning removed the need for both WIP and Released collections. By overwriting content instead of duplicating it, unique client keys (ID attribute values) were also no longer a concern. Eliminating those activities reduced the total number of steps in the process, even though new steps were added for opening and labeling the version. Even though we had met the goal of reducing the number of steps to four, it was necessary to continue testing to determine if the remaining goals had been met.
PDSA cycle 12: Is the revised process faster than the original process?
Data collection showed that the new process took an average of 5.2 minutes, an 80 percent average improvement, as shown in Figure 13. This improvement brought us much closer to the goal of 5 minutes.
The bar chart indicates that implementing versioning normalized the process cycle time more than we had seen previously. Versioning eliminated the need to drag and drop (clone) content from one collection to another. While implementing versioning drastically reduced the cycle times, there was still room for improvement.
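The improvement percentages reported throughout follow from a simple percent-reduction formula. In the sketch below, the 26-minute baseline average is an assumed value, chosen only so that the formula reproduces the reported 80 and 95 percent figures; the article’s percentages are averages across users, so this single-baseline view is illustrative only.

```python
# Percent-reduction formula behind the reported improvement figures.
# The baseline value is a hypothetical assumption, not a number from the article.

def improvement_pct(before: float, after: float) -> float:
    """Percent reduction in cycle time relative to the baseline."""
    return (before - after) / before * 100

baseline = 26.0  # assumed baseline average, in minutes (illustrative)
print(round(improvement_pct(baseline, 5.2)))  # redesigned process: 80
print(round(improvement_pct(baseline, 1.4)))  # final process: 95
```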
PDSA cycle 13: What was the writers’ satisfaction level for the revised process?
I used the same customer survey as before, with satisfaction levels increasing to an average of 4.7, shown in Figure 14. While I was much closer to my goal of 5, I still had more work to do.
As a result of the revised process, satisfaction levels increased across the board. This increase indicated that all customers felt we were moving in the right direction.
Meeting or exceeding the goals
While the process redesign showed much progress, I still had not fully met my goals. For the next several cycles, I focused on further improvements by repeating several cycles but against the revised process.
PDSA cycle 14: How can the revised process be improved?
Once again, I met with subject-matter experts, including writers, CMS vendor support, and the CMS administrator, to get their input on which parts of the revised process were still a concern. Feedback from those sessions included leveraging the CMS right-click menu to more efficiently create versions on demand. Also, supporting modularization was a requirement for this project that had not yet been met. These findings are shown in Table 3. Based on this additional customer feedback, several new requirements emerged. They seemed to focus on support activities, including documentation updates and training more than the process itself. However, it was clear that the labeling process was still in need of further improvement. Even so, the fact that most of the feedback focused on non-process activities was an indicator that we were on the right track.
PDSA cycle 15: Are there fewer steps in the revised process?
With the changes implemented from the previous cycle, the process was further revised. The result was a 73 percent reduction, to a total of only three steps (shown in Figure 15), exceeding the goal of four steps.
TIP: Process improvements can be made beyond initial expectations, so don’t limit yourself.
Supporting versioning for modular content involved automating the labeling of those versions. When a book was versioned, references to modules were also versioned, and the same label was carried over to show the relationship between a book and its components. Because we leveraged the CMS to do this action for the writer, no additional steps were involved. Additionally, by accessing versioning from the right-click menu, version history could be opened and a version created and labeled in a single step.
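The label-propagation behavior described above can be modeled as follows. This is a hypothetical sketch of the behavior only, not the CMS’s actual API; the class and attribute names are invented for illustration.

```python
# Sketch of label propagation: versioning a book also versions its referenced
# modules, carrying over the same label so the book/component relationship is
# preserved. Names are illustrative; the real CMS handles this internally.

class Component:
    def __init__(self, name):
        self.name = name
        self.history = []          # version labels, oldest first

    def create_version(self, label):
        self.history.append(label)

class Book(Component):
    def __init__(self, name, modules):
        super().__init__(name)
        self.modules = modules

    def create_version(self, label):
        # Versioning the book versions every referenced module too,
        # carrying over the same label (done by the CMS for the writer,
        # so no additional steps are involved).
        super().create_version(label)
        for module in self.modules:
            module.create_version(label)

intro = Component("intro.xml")
specs = Component("specs.xml")
guide = Book("user-guide.xml", [intro, specs])
guide.create_version("Edition 3 - archived")
print(intro.history)  # → ['Edition 3 - archived']
```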
PDSA cycle 16: Is the final process faster than the original process?
When I compared cycle times for the final process, I found that there was a 95 percent improvement, with the cycle time taking an average of only 1.4 minutes, as shown in Figure 16. This improvement exceeded the goal of five minutes.
When the previous cycle times were charted for the initial process redesign, I saw that the cycle times for the archiving process had somewhat normalized (see Figure 13). With the additional changes implemented to create the final process, cycle times were normalized even further as a result of the process being as simple as possible.
PDSA cycle 17: What is the writers’ satisfaction level with the final process?
Previous cycles showed that I was meeting or exceeding the goals. Measuring customer satisfaction levels let me know if I had indeed arrived. Survey results showed a 26 percent increase from the original survey, with a perfect score of 5 (very satisfied) across the board, as shown in Figure 17. The goal had indeed been met.
Supporting the redesigned process
With all goals having been met or exceeded, the final phase was to put things in place to support the new process.
PDSA cycle 18: What best practices are needed to support the redesigned process?
Once again, I consulted with the subject-matter experts (see Figure 18). Based on their input, best practices were developed for
- creating a version
- labeling a version
- accessing version history
Customer feedback indicated that FAQs (Frequently Asked Questions) should precede the documented process. Because my organization has limited training resources, FAQs are critical. They provide a framework for the process, rather than expecting writers to just follow the steps in order to achieve the desired outcome. We often receive such feedback from our writers. It is important for them to have an understanding of the concepts and “what ifs” before they can fully digest the process itself.
While I was able to automate and reduce much of the archiving process, some components remained user-driven. Best practices provided the guidelines to ensure that all writers were managing versions using consistent methods.
PDSA cycle 19: Can any CMS collections be eliminated as a result of the redesigned process?
Based on the process redesign, the CMS Administrator was able to reduce collections by 50 percent, for a total of 10 active collections. Released collections are now used for historical purposes only, and therefore, no longer require configuration updates. This saves administration time and decreases overhead. Work in Progress (WIP) collections are now the only collections used, requiring configuration updates as needed. Collections were renamed to support this new strategy (see Figure 19).
Even though content is archived just a few times each year, improving the process has had worthwhile, positive impacts. Implementing versioning has improved the archiving process not only for writers but for the CMS Administrator as well. Additionally, this process improvement has positively affected the bottom line of our organization (see Table 4).
Improving the efficiency of the CMS archiving and new edition process for customer source content created several benefits.
- Reductions: cycle time 95 percent; number of steps 73 percent; collections 50 percent
- 26 percent increase in customer satisfaction levels
- Less administrative overhead and versioning of modular content
Return on Investment
In addition, cost savings were realized. Savings were measured by Net Present Value (NPV), shown in Table 4, which accounts for labor and materials.
NPV for this project focused on time savings, project costs, and the project duration. There were a total of 68 CMS users. Of these, 18 users did not archive content, including Managers, Leads, and Editors. That left 50 users who archived content, plus 1 CMS Administrator. The project team included 5 users representing each of our business segments. The average cycle time before the process redesign was 52 minutes. The average cycle time after the process redesign was 1.4 minutes. To help determine NPV, the number of documents per release and the number of releases per quarter were calculated. Additional data was pulled from our project tracking database for work impacting customer content collections. Since those were reduced by half, we divided the time by half. This data was ultimately used to determine the number of hours saved per day per user (for 51 users).
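A hedged sketch of this kind of savings calculation appears below. Only the cycle times and the user count come from the article; the labor rate, archiving frequency, project cost, and discount rate are hypothetical placeholders, so the resulting figures are illustrative, not the Table 4 values.

```python
# Sketch of an NPV-style savings calculation. Cycle times and user count
# are from the article; every other figure is a hypothetical placeholder.

USERS = 51                          # 50 writers + 1 CMS administrator
BEFORE_MIN, AFTER_MIN = 52.0, 1.4   # minutes per archiving task (article)
TASKS_PER_USER_PER_QTR = 6          # assumed archiving frequency
HOURLY_RATE = 60.0                  # assumed fully loaded labor rate, USD
PROJECT_COST = 20_000.0             # assumed one-time project cost, USD
DISCOUNT_RATE = 0.08                # assumed annual discount rate

minutes_saved_per_qtr = USERS * TASKS_PER_USER_PER_QTR * (BEFORE_MIN - AFTER_MIN)
annual_savings = minutes_saved_per_qtr / 60 * HOURLY_RATE * 4  # four quarters

def npv(initial_cost, annual_cashflow, rate, years):
    """Net Present Value: discounted future savings minus the up-front cost."""
    return sum(annual_cashflow / (1 + rate) ** t
               for t in range(1, years + 1)) - initial_cost

print(f"annual labor savings: ${annual_savings:,.0f}")
print(f"3-year NPV: ${npv(PROJECT_COST, annual_savings, DISCOUNT_RATE, 3):,.0f}")
```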
Several key learnings resulted from the project. It is important to ensure you have the right team from the outset. Use available Lean Sigma tools to validate any assumptions. Having quantitative proof of improvement is empowering. Instead of communicating general improvements to management, you can provide statistical data to better illustrate the impact. Taking the time to collect and analyze the data is worth the extra effort and may result in exceeding initial expectations and goals.
About the Author
Stacey Swart is the Content Management System Administrator and Strategist at the ESS division of Hewlett-Packard Company (HP). Stacey has over 16 years in the tech industry in areas ranging from technical support to technical communication, and is certified by HP as a Lean Sigma Green Belt. She holds a BS from the University of Kansas in Education and English.