CIDM

October 2012


Using Web Analytics for Documentation


Mark Hoeber, IBM Corporation

Web analytics is the measurement, collection, analysis, and reporting of internet data for purposes of understanding and optimizing web usage (Wikipedia). Through free tools such as Google Analytics or high-end enterprise solutions such as the IBM products our team documents, we can access a vast—and often overwhelming—amount of data about how our customers use our web content.

As information developers, we have a great opportunity to use web analytics data to better understand our customers’ behavior and needs. However, focusing on the right data, and reaching actionable conclusions, is a challenge. The metrics, or key performance indicators (KPIs), typically used to measure web site usage are of limited value for documentation: often we can’t interpret what the values or trends are revealing, and we’re not sure whether a given report is a positive or negative indicator. Despite these difficulties, if we dig a little deeper we can find opportunities to better understand our customers and to improve our ability to help them succeed. This article provides an overview of the challenges our team has encountered and discusses some of the ways we’ve used web analytics to do a better job.

Background

We deliver documentation for IBM’s Enterprise Marketing Management products on an internet site that is accessed through the products’ user interface. We post HTML, PDF, and Flash documentation on this site, and customers access the right content through context-sensitive links. Our mission is simply to provide the right content at the right time to enable marketers to successfully do their jobs.

We process the logs from this site using one of the products we document, IBM Unica NetInsight, to analyze our users’ behavior. The following ideas are from our ongoing quest to use our own tools to better serve our customers and reach our goals.

Questioning Assumptions in Data

Web analysts often start with some basic metrics, such as

  • number of unique visitors
  • number of repeat visitors
  • average page views per visit
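
To make these metrics concrete, here is a minimal sketch, in Python with hypothetical log records, of how all three can be computed from raw page-view data (real analytics tools do the same aggregation at much larger scale):

```python
from collections import defaultdict

# Hypothetical page-view records: one per page view, already sessionized.
page_views = [
    {"visitor": "a", "visit": 1, "page": "/docs/install.html"},
    {"visitor": "a", "visit": 1, "page": "/docs/config.html"},
    {"visitor": "a", "visit": 2, "page": "/docs/faq.html"},
    {"visitor": "b", "visit": 3, "page": "/docs/install.html"},
]

visits_by_visitor = defaultdict(set)   # visitor -> set of visit ids
views_by_visit = defaultdict(int)      # visit id -> page-view count
for pv in page_views:
    visits_by_visitor[pv["visitor"]].add(pv["visit"])
    views_by_visit[pv["visit"]] += 1

unique_visitors = len(visits_by_visitor)
repeat_visitors = sum(1 for v in visits_by_visitor.values() if len(v) > 1)
avg_views_per_visit = sum(views_by_visit.values()) / len(views_by_visit)

print(unique_visitors, repeat_visitors, round(avg_views_per_visit, 2))
# -> 2 unique visitors, 1 repeat visitor, 1.33 average page views per visit
```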

If you’re marketing a product or service through your web site, you usually want to see these numbers sloping up: The more the better.

But what about for our site, which aims to educate and answer user questions? Is more necessarily better? Let’s look again at the metrics above, through the lens of our own mission.

  • Number of unique visitors: Yes, we want to justify our salaries by having people use our documentation. And if this number is increasing, it may mean a growing customer base or that users get value from reading our content, either of which is welcome. On the other hand, it may mean people are increasingly stuck when using the software and need help, which may be a troubling indicator for our business. So, a 20 percent increase in unique visitors isn’t automatically a positive (or negative) indicator.
  • Number of repeat visitors: If more users are coming back within a defined time period, they must like what we offer. Or, they may not be getting what they need the first time and may be trying again. How do we know? Again, the number or trend doesn’t necessarily tell us how we’re doing.
  • Average page views per visit: It’s nice to think that users find our content interesting and valuable and that they want to follow links and continue reading more. But what if they’re viewing multiple pages because they can’t find what they need and are painfully searching for the right answer? They may in fact be quite unhappy clicking through our documentation. So we don’t really know if more clicks are a good sign or not.

As you can see from these examples, when analyzing how people use documentation, the meaning of the metrics can be ambiguous. When selling something, we aim for higher values. But when the goal is increasing customer success, higher may not always be better. So we need to start by focusing on what our goals are, not just on the data the software is giving us.

Measuring Goals

Of course marketers look for more than increasing counts. Traffic without purchases is just cost without revenue. What matters more is a conversion rate. A conversion rate is the number of site visits that achieved a specific goal divided by the total number of visits. (Thus you can see that disproportionate effort generating unqualified traffic would lower the conversion rate.)
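
For example, with hypothetical numbers: if a site receives 2,000 visits in a month and 50 of those visits end in a completed purchase, the conversion rate is 50 / 2,000, or 2.5 percent.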

Some common goals factored into a conversion rate are

  • completing a purchase
  • providing your contact information
  • sharing content on a social network

Marketers can configure their web analytics tools to make tracking a conversion rate easy.

But what about those of us analyzing a documentation site? What specific actions can we count as conversions?

Again, unfortunately, there’s no easy equivalent to making a sale. Our success rate is more difficult to measure. Sure, many sites ask if the page you’re on was helpful, and a click of the “Yes” button can certainly be set up as a conversion. But should both an explicit “No” and no response be considered the same? Considering typical low response rates, probably not.

So what to do? It seems that what we can measure—clicks and timing—is not telling us what we want to know, and what we want to measure—customer success—is qualitative. This is, generally, true. Nevertheless, with a little creativity, we can still use web analytics to improve our performance.

Think about goals and conversions again. Typically we want the conversion rate rising. On an ecommerce site, the more people who click “Complete Purchase” the better. But what if we set up a goal for which we want the lowest possible ratio? Why would we do that?

Here’s one scenario. Each screen in our software has a context-sensitive link to a specific help page, so we know how many people clicked help in each screen. We could configure our web analytics tool to count each visit to a given help page as a conversion, and to trigger a flag when the conversion rate exceeds, say, 20 percent. And what does that flag tell us? Probably not that users love reading that specific help page. It may, though, be telling us that the associated application screen is confusing users. Based on this information, we can take a few concrete actions:

  • We can prioritize this help page and any documentation related to the task involved. We can provide more thorough editing and run some user tests to better streamline and clarify the content.
  • Or better yet, we can work with the product team to improve the user interface and workflow for the screen in which people are frequently requesting help. After all, the flag is telling us that users need disproportionate help with this screen.

We can also use conversion rates in the opposite way. For content with an extremely low conversion rate, we can de-prioritize maintenance or get rid of it altogether. Or we might be able to use the associated application screen as a model of clarity.
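
As a rough illustration of this kind of flagging, here is a minimal sketch in Python, assuming a hypothetical per-screen summary of visits and help clicks (in practice the thresholds would be configured in the analytics tool itself):

```python
# Hypothetical counts per application screen: visits to the screen and
# clicks on its context-sensitive help link in the same period.
screen_stats = {
    "campaign_editor": {"visits": 1200, "help_clicks": 310},
    "report_builder": {"visits": 900, "help_clicks": 45},
    "home_dashboard": {"visits": 5000, "help_clicks": 12},
}

HIGH = 0.20  # disproportionate help requests: review UI and help page
LOW = 0.01   # barely used: de-prioritize, or treat the screen as a model

for screen, s in sorted(screen_stats.items()):
    rate = s["help_clicks"] / s["visits"]
    if rate > HIGH:
        print(f"{screen}: {rate:.0%} help rate -- investigate screen and page")
    elif rate < LOW:
        print(f"{screen}: {rate:.2%} help rate -- candidate to de-prioritize")
```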

Next Steps

Interpreting data for a documentation site is not straightforward, and there’s no clear-cut way to measure our own success. However, as these examples show, we can use web analytics to help us set priorities and focus our efforts. We continue to explore more ways web analytics can help us improve our documentation. Here’s another example.

Path reports are useful for understanding what users are looking for. The example in Figure 1 shows that in the selected time period, a plurality (32 percent) of users came to the page we’re analyzing through the page indicated by the top blue line.


Figure 1: Example of a Path Report

Why do users take this path? Is it a natural flow in the process? The organization of the TOC? A prominent link? What about the other entry pages? Are they part of this user task? Which pages do users view after this page? Does the most common path match what we consider the right flow of information? If not, how can we better guide users as intended?

As you can see, the data is just a starting point for our inquiry. We should use it to guide more qualitative analysis, with the goal of improving the information architecture.
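
For teams whose analytics tool doesn’t produce path reports directly, a report like the one in Figure 1 can be approximated from ordered page views. This minimal sketch, using hypothetical visit sequences, counts which pages immediately precede a page of interest:

```python
from collections import Counter

# Hypothetical ordered page sequences, one list per visit.
visits = [
    ["/toc.html", "/tasks/create-campaign.html", "/ref/fields.html"],
    ["/search.html", "/tasks/create-campaign.html"],
    ["/toc.html", "/tasks/create-campaign.html"],
]

target = "/tasks/create-campaign.html"
entries = Counter()
for pages in visits:
    for prev, page in zip(pages, pages[1:]):
        if page == target:
            entries[prev] += 1

total = sum(entries.values())
for prev, count in entries.most_common():
    print(f"{count / total:.0%} of arrivals at {target} came from {prev}")
# -> 67% of arrivals came from /toc.html, 33% from /search.html
```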

Other examples of useful data that we continue to explore are

  • Geographic distribution of visitors. In the global environment we work in, using this data to help prioritize translation projects allows us to focus on the greatest need.
  • Browser and platform data. Like most organizations, we can’t realistically test our site on every combination of device, operating system, and browser. But we can make sure we validate our content for the systems most commonly used.
  • Search terms. What are users searching for? Do the common terms align with our own taxonomy? What does the language they use reveal about their expectations?
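
As one illustration, here is a minimal sketch of a search-term review, assuming a hypothetical query log and a set of terms drawn from our own taxonomy:

```python
from collections import Counter

# Hypothetical on-site search queries and documentation taxonomy terms.
searches = ["flowchart", "flow chart", "mailing", "offer codes",
            "flowchart", "email blast"]
taxonomy = {"flowchart", "mailing", "offer codes"}

for term, count in Counter(t.lower() for t in searches).most_common():
    status = "in taxonomy" if term in taxonomy else "NOT in taxonomy"
    print(f"{term!r}: {count} searches ({status})")
# Frequent terms outside the taxonomy ("flow chart", "email blast")
# suggest vocabulary users expect but the documentation doesn't use.
```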

Web analytics provides great data. But the data is just the beginning of the process. We need to carefully consider what the data can tell us about our goals, then use data to ask the right questions, guide qualitative research, and arrive at actionable conclusions. Ultimately, by looking beyond available metrics, we can use web analytics to contribute to improved documentation and enhanced customer success.


Mark Hoeber

IBM Corporation

mhoeber@us.ibm.com

Mark is the senior manager for information development in IBM’s Enterprise Marketing Management group. Mark’s team is responsible for all customer-facing documentation for the suite of products from the Unica and Coremetrics acquisitions. Mark has worked in technical communications for over 15 years and is keenly interested in marketing, analytics, and content strategy. He’s a contributing author of The Java Tutorial: A Short Course on the Basics and has presented to the Society for Technical Communication on Drupal. Mark holds an MBA from Babson College and completed the graduate program in software engineering at Harvard University Extension School.

 
