The Value of Conducting Client Surveys and How to Get Started


CIDM

August 2010


Christie Bozza, ADP, Inc.

At face value, the question of why we conduct client surveys seems easy to answer: we want to know what our clients think about the applications, products, or services they are using. Overall, do they like them? Are they able to complete their tasks? What types of assistance or documentation would help them more? Is the software easy to use? Can they find the options they are looking for on the user interface? The list goes on.

However, the main ideas boil down to two basic questions: do clients like the product, and does it suit their needs?

So, What Are the Basic Steps Involved in Building and Assessing a Survey?

There are many steps, but breaking the process into the following stages makes it easier to get started.

  1. Ask yourself questions: who is the survey for, why are we doing this, how can we improve the user experience, and what is the best survey method?
  2. Write and edit participant questions.
  3. Use the survey tool.
  4. Collect the data and present results.

Ask Yourself Questions

First, you have to define the purpose by asking:

  • Who is the audience that will be responding to the survey?
  • Can the user find the necessary information to complete the task(s)?

Next, determine the needs of your business partners; meeting with them helps you better define their objectives. Once you know what type of information is important to the business, you can ask user-focused questions that will provide the sought-after data. For example, based on the client feedback they are receiving, business partners may be interested in polling users about a particular software feature to see whether a majority of users are encountering the same issue.

Some questions to ask business partners include:

  • Why are we conducting this survey?
  • What type of specific feedback are we looking for?
  • Is there a requirement to solicit feedback on a certain feature, or does the survey pertain to overall client satisfaction?

Before writing questions, it is also helpful to ask yourself about the clients and survey methods. Following are ideas to help you get started:

  • What types of clients will you be surveying, and do they share common characteristics?
  • To which type of product or service do the questions refer (for instance, documentation, online help, user guides, videos, and so on)?
  • Is the feedback you are asking for related to task assistance, or is it scenario-based?
  • For which deliverable do you have the greatest chance of a high response rate? (For example, if you have data showing that certain online help topics receive more user hits than the same material in a user guide, you could include the survey in the online help. This placement increases your chances of participants seeing and responding to the survey.)

Write and Edit Participant Questions

Here are some tips to keep in mind as you write your questions.

  • Word choice matters. For example, if you are surveying business professionals, use proper grammar and avoid colloquialisms, such as kid instead of child. Colloquialisms can undermine the importance of the survey.
  • Phrasing questions is an art form in itself. Consider these three questions and the different types of answers they elicit.
    • Do you like the software?
    • Does the text on the user interface increase your understanding of this feature? (agree, agree somewhat, neutral, disagree somewhat, disagree)
    • How does this version better help you complete your daily job functions?
  • There is no writing like rewriting, a truism that holds especially true for survey questions. Very often, it takes several revisions to be sure you are asking users what you really want to know.

Once you have proofread the questions yourself for grammar, technical accuracy, and business intent, you can send them out for formal review. It may be necessary to conduct additional reviews with your business partners or management to incorporate the necessary points of view.

Use the Survey Tool and Choose Question Formats

There are many survey tools that are relatively easy to use and quick to learn. Several are offered on a free trial basis for 30 or more days. Once you have chosen a tool, you will need to understand the question formats it supports.

Depending on the survey software you use and the questions you ask, there are a variety of question formats from which to choose. Some common formats include:

  • Multiple choice is a good option if you want to guide and limit the participant’s answer to one of several choices.
  • Rating scale is one of the most commonly used formats for mini-surveys; it gives the participant a range of options to choose from, commonly running from strongly disagree to strongly agree. Because rating scales are quick and easy to answer, participants are more likely to respond right away. When this question type is used alone, completion rates are high and the surveyor can collect the data quickly (see Figure 1).
  • Matrix formats allow a large amount of information to be included in rows and columns. This format is ideal if you have standardized questions you would like participants to answer for a variety of features.
  • Textbox formats are ideal if you want to see feedback in the participants’ own words. You can also use them in conjunction with other question formats to provide space for clarification or additional information.

Figure 1: Rating scale
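
To make these formats concrete, here is a small, tool-agnostic Python sketch that models one question of each type as plain data. The field names are assumptions for illustration only; every survey tool defines its own structures.

# Hypothetical representation of the four common question formats.
questions = [
    {"format": "multiple_choice",
     "text": "Which deliverable do you use most often?",
     "choices": ["Online help", "User guide", "Videos"]},
    {"format": "rating_scale",
     "text": "The text on the user interface increases my understanding of this feature.",
     "scale": ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]},
    {"format": "matrix",
     "text": "Rate each feature on ease of use.",
     "rows": ["Search", "Navigation", "Printing"],
     "columns": ["Difficult", "Neutral", "Easy"]},
    {"format": "textbox",
     "text": "What would you change about the online help?"},
]

for question in questions:
    print(question["format"], "->", question["text"])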

Decide How to Distribute your Survey

There are different ways to distribute a survey to clients. If you want to distribute via email to 100 or more users, you can create a database list with your business or marketing partners. If you use this method, include the survey’s hypertext link in the email text, and consider sending from a “Do Not Reply” mailbox. For a survey of fewer than 100 users, you can create an internal distribution list in Outlook.

You may also want to embed the survey code or link directly in a client deliverable, such as a PDF document or online help. Once you know the method of inclusion, you need to “open” the survey in the software itself so that the database can begin collecting responses.

Note: If you are sending clients a survey by email, it is a good idea to give participants a deadline for completing it. Don’t give users longer than one week; people tend to forget or lose interest if the survey stays open longer.
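
As a rough illustration of the email approach, here is a minimal Python sketch that sends a message containing the survey’s hypertext link from a “Do Not Reply” mailbox. The relay host, addresses, subject line, and survey URL are placeholders rather than real values, and your mail infrastructure may require authentication or a different sending method.

import smtplib
from email.message import EmailMessage

# Placeholders only: substitute your own relay host, "Do Not Reply" mailbox,
# recipient list, and the hypertext link obtained from your survey tool.
SURVEY_LINK = "https://surveys.example.com/s/client-feedback"

msg = EmailMessage()
msg["Subject"] = "Two-minute survey: help us improve the online help"
msg["From"] = "do-not-reply@example.com"
msg["To"] = "client@example.com"
msg.set_content(
    "We would value your feedback on the new online help.\n"
    f"The survey takes about two minutes: {SURVEY_LINK}\n"
    "The survey closes one week from today."
)

# Send through an internal relay; add login or TLS here if your server requires it.
with smtplib.SMTP("mail.example.com") as server:
    server.send_message(msg)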

Collect Data and Present Results

Once the survey duration has passed, the surveyor can close the survey and pull the data. Most survey tools present each question’s results as both numerical data and charts or graphs for visual dashboards. If the question was a text field, a list of each participant’s feedback is provided. These comments can be tallied by hand into groups and analyzed. Keep in mind that tallying is very time consuming, but it often yields important general trends in user feedback.
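
To give a sense of how free-text comments might be grouped, the following Python sketch tallies comments against a set of keyword categories. The categories, keywords, and sample responses are invented for illustration and are not output from any particular survey tool; comments the keywords miss still need manual review.

from collections import Counter

# Illustrative keyword buckets; adjust the categories and keywords to the
# themes that actually appear in your participants' comments.
CATEGORIES = {
    "navigation": ["find", "search", "menu", "navigate"],
    "content": ["example", "detail", "explain", "missing"],
    "ease of use": ["easy", "confusing", "intuitive", "slow"],
}

def tally_comments(comments):
    """Count how many comments mention each category at least once."""
    counts = Counter()
    for comment in comments:
        text = comment.lower()
        for category, keywords in CATEGORIES.items():
            if any(keyword in text for keyword in keywords):
                counts[category] += 1
    return counts

# Made-up responses for demonstration.
responses = [
    "I could not find the payroll option in the menu.",
    "The steps were easy to follow.",
    "Please add an example for the tax setup screen.",
]
print(tally_comments(responses))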

After analyzing the data, the surveyor can prepare a findings report listing each question and its corresponding results. Where applicable, you can supplement each question with a visual chart. The most important part of the findings report is the executive summary, which lists the high-level findings. You can also include suggestions for improvements based on participant feedback.
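
As one way to produce those charts, the short Python sketch below plots the responses to a single rating-scale question with matplotlib. The question text and response counts are invented; substitute the figures exported from your survey tool.

import matplotlib.pyplot as plt

# Invented counts for one rating-scale question; replace with exported data.
labels = ["Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree"]
counts = [4, 9, 15, 32, 18]

plt.bar(labels, counts)
plt.title("The online help increased my understanding of this feature")
plt.ylabel("Number of responses")
plt.xticks(rotation=30, ha="right")
plt.tight_layout()
plt.savefig("question_results.png")  # drop the image into the findings report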

How do You Know if You Have Made an Improvement?

Right off the bat, you should feel good that you have attempted to reach users. By doing so, you send a powerful message that you care about what participants think and that you want to use their ideas to improve the user experience in the future. You create a positive perception of the improvement process.

You can analyze and track user feedback according to trends. With a dedicated strategy, you can also improve the product or service and watch how user feedback begins to shape the improvement trend line. For example, many companies track progress by the number of online hits for a given topic or page. If you analyze this information in relation to the data derived from both initial and follow-up surveys (to create a baseline), the longitudinal information becomes a reliable benchmark of trends. In her book How to Conduct Surveys: A Step-by-Step Guide, Arlene Fink notes that “A trend design means surveying a particular group (sixth graders) over time (once a year for three years)…You are assuming that the information you need about the sixth graders will remain relevant over the three-year period.”
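
The sketch below shows one simple way to line up per-wave survey averages with online-help hits so that the baseline and follow-up waves can be compared over time. The wave names, ratings, and hit counts are made up for illustration.

# Illustrative per-wave results: average rating on a 1-5 scale plus
# online-help hits for the topic being tracked. Replace with real exports.
waves = [
    ("baseline", 3.2, 410),
    ("follow-up 1", 3.6, 455),
    ("follow-up 2", 3.9, 520),
]

previous_rating = None
for name, rating, hits in waves:
    if previous_rating is None:
        print(f"{name}: average rating {rating}, help hits {hits}")
    else:
        change = rating - previous_rating
        print(f"{name}: average rating {rating} ({change:+.1f} vs. previous wave), help hits {hits}")
    previous_rating = rating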

Is it Worth it?

The answer is a resounding yes, that is, if you care about what your clients think. Realistically speaking, if you locate and implement only one usable piece of feedback per survey, you have made a positive change. Over time, as you improve methods and implement surveys on an ongoing basis, you can increase user satisfaction and even completely change their perceptions, hopefully for the better.

In 2002, Microsoft Corporation identified global leaders in customer, partner, and employee satisfaction by conducting extensive benchmarking research to learn more about the policies, processes, and traits that would enable them to inspire such loyalty.

The study uncovered the following success factors, which were common to the top satisfaction leaders whose businesses were most relevant to Microsoft:

  • A culture of accountability to customers and partners
  • Effective listening and responding
  • Broadly perceived product value and innovation

Later that same year, senior executives and leaders throughout Microsoft unified all customer and partner initiatives into the company’s current Customer and Partner Experience (CPE) strategy. Following this unification, the software giant saw steady year-over-year gains in customer and partner satisfaction.

In a study by Waggener Edstrom Worldwide about Microsoft and its outreach to business partners using surveys, it was noted that in “2006, in part through product improvements and customer and partner outreach, we [Microsoft] achieved the highest customer and partner satisfaction levels since beginning our global surveys.”

Christie Bozza

ADP, Inc.

christie_bozza@adp.com

Christie Bozza has been a writer and information developer since 1993. With a Master’s degree in Creative Writing and a Bachelor’s degree in Critical Writing, Christie has worked as an adjunct professor of English and writing for William Paterson University and as a writer for National Pharmacies (Merck) and Ernst and Young. She has been employed as a Senior Technical Writer at Automatic Data Processing, Inc. (ADP) since 2006.

REFERENCES

Fink, Arlene. How to Conduct Surveys: A Step-by-Step Guide. Thousand Oaks, CA: Sage Publications, Inc., 2006. ISBN 141291423X.

Waggener Edstrom Worldwide. “Customer and Partner Experience: Increasing the Satisfaction of Microsoft Customers and Partners.” General News Media Inquiries, 2006.
