Getting Valuable User Feedback via Surveys

Kathy Madison, Comtech Services

Customer surveys are just one way of getting valuable user feedback, but how do you get the best results? During the July CIDM managers roundtable discussion, members shared their experiences with conducting documentation-specific customer surveys, highlighting what worked and what didn’t. Surprisingly, only a handful of member companies on the call had conducted surveys specifically focused on their technical content. Those that had admitted that their first attempts weren’t as successful as they had hoped, causing them to adjust their surveys over time.

The most common mistakes made early on were asking questions that have no actionable follow-up activity or asking questions that were not specific enough to be useful. For example, one member’s survey asked: Did you read the documentation? Yes or No. For the next survey, they decided a better question was: If you didn’t read the documentation, why not? This allowed them to understand whether the customer already knew the content, felt the content was too hard to use, or couldn’t find the documentation. Another member said they used to ask users to rank the findability of their content, which only told them there was an issue with findability. From there, they got more detail by asking: What, specifically, would improve your ability to find the content? Instead of leaving the question open-ended, they provided answers, such as Better page titles or More faceted search. The reworded question gave them very specific things that could be acted on and took the brain work away from the customer. This same member said that every time she writes a survey question, she gives it the litmus test of “what am I going to do with this data?” If she doesn’t know, she doesn’t ask the question.

During the discussion, several great tips were shared about writing surveys:

  • Keep surveys short and simple.
  • Stay focused and unbiased.
  • Ask a combination of yes/no, multiple choice, scaled/ranking and open-ended questions.
  • Add logic to your surveys. In particular, if someone responds positively to a question about a particular area of interest, automatically skip the follow-up questions that ask how to improve that area (a minimal sketch of this kind of skip logic follows this list). One member got five times more meaningful responses when she asked for comments only if the user gave an area a low rating.
  • Give participants an honest estimate of how long the survey will take to complete. Many survey tools, like SurveyMonkey, give you an estimate, but it is good practice to ask several internal people to take the survey to verify the timing. In addition, if your tool provides timing data for each respondent, you can adjust your estimate based on the actual times of your first few respondents.
  • Consider asking demographic questions so you can slice the data in more meaningful ways.
  • Don’t use two-part questions that only have one response option. For example, don’t ask: The text and diagrams are easy to understand? Yes or No. What if the user feels the text is easy but the diagrams are not? It is better to separate the questions or to offer more responses, such as: Both the text and diagrams are easy to understand, Only the text is easy to understand, Only the diagrams are easy to understand, Neither the text nor the diagrams are easy to understand.
  • Provide an option to opt out of a multiple-choice question, such as None, Does not apply, or No opinion.
  • Don’t take up survey real estate for things you already know are issues.
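
Most survey tools let you configure this kind of skip logic directly in the tool, but here is a minimal sketch in Python of the underlying idea, assuming a hypothetical 1-to-5 rating scale and a hypothetical cut-off for what counts as a low rating:

  # Minimal sketch of survey skip logic (hypothetical scale and threshold).
  # A follow-up comment question is shown only when a respondent rates an area low.
  LOW_RATING_THRESHOLD = 2  # assumed cut-off on a 1-to-5 scale for a "low" rating

  def follow_up_question(area, rating):
      """Return the follow-up question to show, or None to skip it."""
      if rating <= LOW_RATING_THRESHOLD:
          return f"What specifically would improve the {area}?"
      return None  # positive response: skip the follow-up entirely

  # A respondent who rates findability 2 sees the follow-up; one who rates it 4 does not.
  print(follow_up_question("findability of the content", 2))
  print(follow_up_question("findability of the content", 4))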

A few members had the luxury of conducting surveys at their user trade shows or conferences. In these cases, providing an incentive to participants was critical. At one event, a member had three booths where people could take the survey; the one with no incentive had very little traffic, while the one that gave a gift to every participant was much busier. Raffles are also a popular incentive. Gifts ranged from branded reusable water bottles (given to all participants) to Google Home devices, GoPro cameras, gift cards, and product discounts. To get people to their booths, members took advantage of social media, tweeting folks to head to the documentation booth to take a survey and be entered into a drawing. From Mark Forry’s 2016 Best Practices conference presentation: “The average response rate for non-incentivized review requests is around 0.1%.” (from www.ventureharbour.com). If you are planning to conduct a survey at a trade show, it was recommended that you pass out business cards with a QR code linking to your survey, along with information about the incentive you are offering and how long the survey will take to complete. Alternatively, have tablets available in your booth so customers can take the survey on the spot. An advantage of having customers in your booth is that you can watch their reactions to the questions and engage in conversation to gather valuable anecdotal data.
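
If you want to produce that QR code yourself, here is a minimal sketch using the third-party Python qrcode package (installed with pip install "qrcode[pil]"); the survey URL shown is a hypothetical placeholder for whatever link your survey tool gives you:

  # Minimal sketch: generate a QR code image that links to an online survey.
  # Assumes the third-party "qrcode" package and a hypothetical survey URL.
  import qrcode

  survey_url = "https://www.surveymonkey.com/r/your-survey-id"  # hypothetical link

  img = qrcode.make(survey_url)  # build the QR code for the survey link
  img.save("survey_qr.png")      # an image you can print on the business cards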

Here are a few example survey questions:

  • The content I need is easy to locate. Agree or Disagree. If the results are negative, you might reorganize your content or revisit your taxonomy strategy.
  • The content contains unnecessary information. Agree or Disagree. If users agree, you might want to apply minimalism techniques.
  • If you could change one thing about the content, what would it be? This will give you an idea of where to focus your priorities.
  • What is your experience level with using our products? New user, Occasional user, Seasoned user, Past user. Demographic questions like this help you do better cross-question analysis (a sketch of slicing responses this way follows this list).
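
To illustrate how a demographic question enables that kind of cross-question analysis, here is a minimal sketch using pandas; the column names and answers are hypothetical stand-ins for whatever your survey tool actually exports:

  # Minimal sketch: slice survey responses by a demographic question with pandas.
  # The column names and data are hypothetical, not an actual survey export.
  import pandas as pd

  responses = pd.DataFrame({
      "experience_level": ["New user", "Seasoned user", "New user", "Occasional user"],
      "easy_to_locate": ["Disagree", "Agree", "Disagree", "Agree"],
  })

  # Cross-tabulate findability answers by experience level to see, for example,
  # whether new users struggle more than seasoned users do.
  print(pd.crosstab(responses["experience_level"], responses["easy_to_locate"]))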

There were mixed opinions about whether you should localize your surveys, and whether you should keep asking the same questions from year to year so you can track improvements or instead look for trends in an initial survey and then dig deeper into those trends in your next survey. Conducting follow-up phone interviews with 6-10 people is a great way to get additional information from users. In the interviews, keep the structure of the questions the same from interview to interview, but personalize each interview based on the participant’s responses. It was noted that interviewing more than 10 people will not reveal additional trends, since a pattern in the responses usually appears by the fifth or sixth interview. Regardless of whether you conduct your surveys online or at a trade show, all agreed that surveys should take less than 15 minutes and interviews no longer than 30 minutes.

Speaking of surveys, CIDM is currently conducting its annual benchmark study. This year’s topic is the use of metrics and we will share the results at the Best Practices conference in September. Hope to see you there!
