Creating a Customer Information Program: Nurturing customer relationships and leveraging feedback


CIDM

June 2011




Dawn Eisner, NetApp, Inc.

Collecting customer information is critical to the content development process, but the data gathered can also drive organizational and enterprise-level knowledge management changes.

How the NetApp Customer Information Program Began

NetApp senior management created a position that would help content developers within the product documentation team (Information Engineering) get a better understanding of customers’ and partners’ technical documentation requirements with the goal of improving technical documentation content.

NetApp already had several ways for employees to connect with customers directly, but some of the programs involved a non-technical customer audience, while Information Engineering’s (IE) audience is technical. Other programs reached a more technical audience, but participation was closed to all but the Marketing and Product Management functions. While we continued gathering customer information from these programs, we also had to create other ways to interact with customers directly.

Identifying Existing Feedback Avenues to Get Started

Feedback was already available from many avenues, but content developers were not fully using it or could not access it efficiently and effectively.

  • We had a feedback email address in all of our published documents, but we were not routinely responding to messages, tracking the actions taken, or reviewing the feedback for issue themes.
  • Support offered an opportunity for content developers to sit with representatives as they took customer calls. This opportunity enabled content developers to see when and how Support used the documentation to resolve customer situations. However, the IE team had globally located members, and Support was located at only one site at the time. The opportunity was more effective when the two representatives were co-located.
  • Senior Support representatives met quarterly with product operations representatives to discuss top customer problems, but IE was not included in these meetings and often heard about documentation problems only in generic terms months later, often too late to investigate and respond appropriately.
  • Field personnel regularly complained to their contacts within NetApp about content issues, but by the time content developers received the message, the problems were not clear enough to take action.
  • NetApp already had a corporate Customer Listening Program and conducted an annual customer survey. The survey results for documentation were unsatisfactory. The team managing the corporate program wanted the IE Customer Information Program to investigate and create action plans to address the root cause of the discovered problems.

All of these avenues gave IE some specific usability and accuracy feedback that would improve the existing content, but some of the improvements customers wanted required participation and change from groups outside of IE. For the program to make effective use of any of this feedback and to make changes in other groups’ processes, we needed to develop proactive internal procedures and open cross-functional communication.

Building Relationships and Credibility with Documentation Users

Because IE had not been consistent in any one approach to user feedback, it was necessary to build credible relationships with both internal and external feedback providers. Developing better relationships meant creating processes and responding to feedback in a consistent and complete way. The four things we needed to do to earn back the trust of our customer community were

  • Acknowledge the feedback—always!
    Nothing defuses negative feedback like the willingness to really listen and acknowledge the complaint. An appropriate response requires empathy and a certain detachment from the content, but it is one of the most important things to do well. Good content developers should be able to put themselves in the customer’s situation and imagine what that customer might experience with the product or the instructions provided.
  • Determine how to address the issue and validate the fix with the feedback provider.
    Asking users if the action taken addressed the issue adequately gave us another chance to make sure we were addressing the right concern in the right way so other users could avoid the situation.
  • Fix the issue.
    Follow-through builds credibility.
  • Tell the feedback providers their issues are closed and thank them for their time and effort.
    A closed-loop process builds credibility, the thoroughness inspires confidence, and expressing sincere gratitude shows customers you appreciate their time and effort.

These are the cornerstones of any good customer program.

Creating Processes that Support Successful Feedback Follow-Through and Metrics

IE developed a process in which management acknowledged and responded to the feedback within a short, specific timeframe. The manager thanked the provider and communicated who would be responding with more detail. The assigned person

  • Opened a documentation bug report
  • Notified the feedback provider of the actions taken
  • Verified that the actions addressed their issue
  • Fixed the problem
  • Closed the bug report and thanked the feedback provider again
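As an illustrative sketch only (not NetApp's actual tooling; all names and states here are assumptions), the closed-loop steps above can be modeled as a record that enforces their order, so no feedback item can be closed without every step completing:

```python
from dataclasses import dataclass, field

# Hypothetical step names mirroring the closed-loop process in the article.
STEPS = [
    "bug_opened",         # open a documentation bug report
    "provider_notified",  # notify the feedback provider of the actions taken
    "fix_verified",       # verify the actions addressed their issue
    "problem_fixed",      # fix the problem
    "closed_with_thanks", # close the bug report and thank the provider again
]

@dataclass
class FeedbackItem:
    provider: str
    summary: str
    completed: list = field(default_factory=list)

    def advance(self, step: str) -> None:
        # Enforce the closed loop: each step must follow the previous one.
        expected = STEPS[len(self.completed)]
        if step != expected:
            raise ValueError(f"expected {expected!r}, got {step!r}")
        self.completed.append(step)

    @property
    def is_closed(self) -> bool:
        return self.completed == STEPS

item = FeedbackItem("customer@example.com", "Step 4 of the install guide is ambiguous")
for step in STEPS:
    item.advance(step)
print(item.is_closed)  # True
```

The value of a structure like this is that "closed" is defined by the process, not by a flag someone can set early, which is what makes the loop credible to feedback providers.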

When the Support team expanded globally, IE arranged for content developers to participate in customer calls with Support representatives, and in doing so learned how difficult it was to find troubleshooting instructions within our content. Observing the Support personnel, feeling their sense of urgency to find the right answer and respond to customers’ needs under pressure, gave content developers a new perspective.

Working with Senior Support representatives on common product usability issues also helped content developers understand the importance of making certain content stand out in search results and prioritizing content that addresses the most common customer issues over other, less critical content. IE could also work with product development to include the usability improvements in the next release. Support representatives felt heard and valued by the content developers and a partnership developed between the two groups.

Our field personnel were most vocal about the real-world issues customers were having with documentation. Giving field personnel a clear avenue to provide feedback to the IE management team, and quickly acknowledging, addressing, and closing their issues, improved our reputation more broadly within and across the internal customer-facing and external customer communities.

Improving IE responsiveness to the feedback of these critical internal organizations and external customers began to elevate the value of the group. Creating processes that supported follow-through made us more responsive. Tracking the feedback loop created a consistent approach. Metrics enabled us to use the data gathered to highlight group achievements and gaps.

Investing in Corporate Customer Listening Initiatives

NetApp has a corporate Customer Listening Program team that collaborates with Walker Information, a customer strategy and consulting firm, to help design and implement the program effectively. Investing in IE representation on this team allowed us not only to help shape product documentation survey questions, but also to help other teams and functional areas like Quality, Training, IT, and Product Development understand the survey results and learn how to use the feedback to develop action plans in response. This cross-functional experience gave IE an enterprise-level picture of user documentation and presentation needs. It also placed IE in a position to represent the user experience cross-functionally, offering value in additional areas across the company.

IE followed up with the customers who commented negatively about product documentation. Their responses to the question, “What is the most important improvement we can make to the technical documentation?” revealed that about 30-50 percent of the customer complaints about technical documentation were actually related to unsatisfactory search results and content presentation on our Support site. Content was difficult to find and the search engine was insufficient to meet today’s customer requirements for quick access to documentation. Through the survey, we finally had supporting customer data to share with executives. The executives funded a program to improve the web site and invest in a new search engine.

While correcting the website problems would improve customer ability to find the documentation, we still did not have enough specifics about the customer content issues to take further action.

Evolving the Corporate Customer Listening Program and Getting the Most from the Survey

IE needed to refine the questions to elicit more specific, actionable feedback from the remaining dissatisfied customers. As the corporate Customer Listening Program has grown, survey refinement has become part of the yearly process.

In FY10, we created a completely new technical documentation section within the survey. We wanted to determine which specific types of content drew negative customer perceptions and behaviors. The feedback provided valuable information, such as

  • which content customers use most
  • which tasks customers are usually performing when they look for documentation
  • what type of content customers most want when they look for information
  • where customers look for that documentation
  • how customers want to be notified when content is updated
  • what the specific opportunities for improvement are

The data from this survey helped us put the customers’ comments in the context of the customers’ task, mindset, and task path, so we could see the problem areas more clearly. Armed with the details, the Customer Listening Program team recommended to the CEO and staff that we focus on, and invest in, improving the technical documentation and addressing the related web site improvements customers requested. The Customer Listening Program facilitated the creation of a team with members from key author groups. That team presented a proposal for an enterprise-information-architecture review, new tools and technology, and better collaboration, along with a request for executive sponsorship, team empowerment, and funding to mandate new standards and guidelines for content beyond product documentation.

The survey data highlighted the need for an enterprise-level approach to the information we provide to customers. Enterprise knowledge includes more than just product documentation; it also includes the troubleshooting content in our support knowledgebase, as well as White Papers and Technical Reports provided by our Marketing organization.

Creating Goals and Incentives for Customer-Focused Behavior

When the Customer Information Program first started, the content developers were very excited that they were finally going to get customer feedback about their specific content. They said they were eager to understand how customers used what they wrote, but we rapidly found that defensiveness and dismissal were common reactions, as if customers just did not understand how content was developed. Some lessons we learned were

  • Not all content developers are comfortable communicating directly with customers.
  • Customer issues are unpredictable. Content developers had to learn to accept feedback timing graciously.
  • Some content developers could not detach emotionally from the content they provided, and that affected their ability to listen objectively and react to feedback.
  • Even though some content developers were incorporating the feedback, the handling of feedback was inconsistent.
  • Managing customer feedback was not part of content developers’ measurable goals.

Because of these lessons learned, IE made customer responsiveness a performance review goal and researched potential training and development opportunities in soft skill areas for those who needed to improve. The level of direct customer interaction and cross-author team interaction was increasing, so new listening and communication skills were required.

As an incentive for customers to provide feedback, and to make responding to feedback more fun, we created a Customer Focus award. Both feedback providers and content developers are eligible to receive the award, which comes with a certificate.

Feedback providers who give the most valuable and consistent feedback are often winners of the award, since we want to encourage them to continue helping us. Content developers who routinely show openness to feedback and who have a truly service-oriented attitude often win the award. We recently created a Super Star Customer Focus award for the person who helped make the best improvement to our customer content or showed exemplary customer focus over the period of a year.

In addition to the award system, we announce positive feedback about content developers on the company intranet. Complimenting team members publicly on customer-focused behavior motivates team members to model behavior that is responsive to customer issues.

An award system encouraging customer focus, team member responsiveness, development plans, and review criteria and measurement has helped develop a more customer-responsive culture within IE. We may also be raising customer awareness outside the department, as evidenced by requests for the Customer Information Program Manager to present customer feedback to product development teams.

Using Feedback to Highlight the Importance of Good Content Reviews

IE found that some of the customer content issues were the direct result of insufficient reviews and inadequate review processes. While a developer might find the documentation to be technically accurate, it was sometimes incomplete or vague when followed in a complex customer environment.

We developed a Documentation Advisory Council with members from Support, field personnel, partners, and customers who had exceptional interest in the improvement of our content. The Documentation Advisory Council serves as a review group outside of the normal technical review process and gives us a better sense of how customers and partners will receive the documentation as they apply it in a much more complex, real-world environment.

IE is researching automated review tools that would help identify content issues and highlight the weaker review areas for improvement.

Using Metrics and Data to Drive Change

On a monthly basis, IE tracks and reports the feedback email statistics to IE senior management:

  • Percentage of feedback received that is acknowledged
  • Percentage of feedback by geographical origin
  • Content that receives the most feedback
  • Percentage of internal versus external customers providing feedback
  • Numbers of new, in-process, and closed bug reports

Tracking these data points over time helps us see changes in geographical and customer feedback trends, identify which documents get the most feedback and in what areas, and make sure we continue to respond effectively. We can use this data to understand whether we have enough resources or content developers with the right skill set assigned to the most problematic areas of content and to verify the success of our review process changes.

Data collected from all of the various feedback avenues gives us a platform to communicate areas that affect the customer experience with documentation. For example, we have evaluated the

  • Developer-to-writer ratio gaps against the industry standard, which is helping us obtain new, skilled content development and illustration resources to meet customers’ changing needs
  • Need for exploration of and innovation through new technology, enabling us to create a customizable content delivery proof of concept
  • Need for content development, publishing, and web site publishing process flexibility and automation
  • Benefits of a content model based on Darwin Information Typing Architecture (DITA), where audience experience level, localization, and personalization are critical, highlighting the team’s accomplishments in this area
  • Importance of our effort to incorporate appropriate metadata to improve search results
  • Critical nature of working closely with IT on search engine optimization, content management, and presentation
  • Need for a corporate-level information architecture and enterprise knowledge management strategy

The corporate Customer Listening Program is well on its way to being an established, repeatable, and measurable corporate program with a positive impact on revenue. We have three years of data from our annual corporate Customer Listening Program survey to give us an assessment of our year-to-year ratings patterns and help identify areas of focus. Involvement in the NetApp Customer Listening Program has helped IE highlight the issues with the company’s publications that we can only address through higher-level corporate improvement initiatives.

  • NetApp executives use the results and program recommendations to determine where to place corporate focus and investment. The CEO creates a video in which he thanks customers for participating in the survey and communicates the top three issues the company will address, based on their collective feedback.
  • Business units use the data collected to create improvement action plans.
  • Sales Account teams use the feedback to address revenue opportunities or correct any potential issues with individual customer account members. Account teams close the feedback loop by letting the customer know the actions planned by the company in response to their specific feedback, as appropriate.
  • We can now show that the customers whose account teams directly acknowledged their feedback spent more with NetApp in the following year than customers whose account teams did not close the loop with them.
  • This last point is significant for our documentation team. How customers rate our products and related features is an important contributing factor in how loyal they are to NetApp, and loyal customers invest more with us. If documentation is truly part of the product, and we follow up appropriately with customers to close their issues, then we can attribute at least some portion of the resulting increase in sales to technical content.

Summary

It takes at least three years to develop a solid Customer Information Program, and ours is still evolving. As you identify existing feedback avenues, you will likely uncover process and skill weaknesses. Ensure a closed-loop process to build a credible program. Address process issues first to determine what customer-facing skills are required and where your team may need development. Create incentives, metrics, and goals to encourage a customer-responsive culture. Show early successes to help your program gain momentum. Get involved in customer-facing initiatives, see where you can add value, and then figure out how to use the relationships to benefit your program.

Key points for a successful Customer Information Program:

  • Identify and leverage existing feedback avenues.
  • Build credibility by acknowledging concerns, fixing problems, and closing the loop with anyone who provides feedback. Remember to thank them for their time and effort!
  • Create processes to deal with feedback consistently.
  • Investigate corporate customer feedback programs and get involved.
  • Create incentives and make being customer-focused part of the group culture.
  • Look for opportunities to use feedback themes and metrics to drive change inside and outside of the content development organization.

Developing a Customer Information Program is not only critical to getting content developers the information they need to improve documentation. It can also give you the customer data you need to highlight the importance of great customer-facing technical content to your executive team.

Dawn Eisner

NetApp, Inc.

dawne@netapp.com

Dawn Eisner is a Customer Program Manager for the Information Engineering team at NetApp™. She has thirty years of international experience in the high tech industry, including roles in new product manufacturing and engineering, program management, customer product-line management, and technical communications. Prior to joining NetApp in early 2006, Dawn worked for EMC, where she represented the user experience on an award-winning documentation architecture team. At NetApp, Dawn manages a Customer Information Program to connect content developers with customer feedback and is a key member of the corporate Customer Listening Program.

 
