The Value of a Reciprocal Relationship Between Research and Practice

Rebekka Andersen, University of California, Davis

This article is a follow-up to the “Results of the 2012 Best Practices Research Needs Survey” article in the April issue of the CIDM e-newsletter. Here, I discuss the value of a reciprocal relationship between research and practice and suggest that the time might be right to form a new habit of collaboration between industry and academia. I also discuss research approaches and design possibilities, as well as strategies for overcoming empirical research challenges.


The 2012 Best Practices Conference introduced the panel, “The Researchers, Educators, and Communicators Roundtable: Forming a New Habit of Collaboration.” The panel brought together researchers, educators, and communicators to discuss various challenges information development organizations are facing and how new research data could help managers better address those challenges.1

A key point of emphasis on the panel was the value of a reciprocal relationship between research and practice. In a reciprocal relationship, research questions are identified in practice and then researchers design studies to address those questions. Study results are then fed back into practice to improve practice, at which point new research questions are formulated (see Figure 1).

Figure 1: A Reciprocal Relationship Between Research and Practice (Source: Carolyn Rude)


Panel member Carolyn Rude of Virginia Tech argued that the field of technical communication has not done a good job conducting research that begins with questions grounded in practice and whose results are then used to improve practice. The gap matters, she said, because “Synergy strengthens both practice and the academy.”

Since Carolyn Miller’s seminal 1989 article, “What’s Practical about Technical Writing,” technical communication researchers have called for stronger research/practice connections and a more collaborative partnership between the academy and industry; these calls, however, have largely gone unanswered. Many of these scholars have also called for a research agenda centered on questions that grow out of practice; these questions, they argue, should drive our empirical research, the results of which should be fed back into practice to improve it. For example, in their 2004 article, “The State of Research in Technical Communication,” Ann Blakeslee and Rachel Spilka report the results of their survey of 20 prominent researchers in the field. They found universal agreement that researchers need to investigate more research problems that industry considers important and that this research should lead to guidelines and best practices for the field. Blakeslee and Spilka also called for more forums of exchange and collaboration between practitioners and academics. Nearly a decade has since passed with little evidence of change. To date, the field has struggled to develop and sustain a reciprocal relationship between research and practice.

This struggle is in part due to a strained relationship between academia and industry that has long been in the making. In fact, the first half of Barbara Mirel and Rachel Spilka’s 2002 anthology, Reshaping Technical Communication, is devoted to describing this problem and possible solutions to it. One reason for the strained relationship is that we in academia have not done a good job feeding the results of our research back into practice in ways that are accessible and useful. The primary venue for publishing research has been the academic journal article, which is written for and read almost exclusively by technical communication scholars, leaving the research inaccessible to those who are best positioned to apply its findings. Another reason is that the field does not have good ways to involve practitioners in identifying research questions. Rude emphasized this point at Best Practices.

How can communication professionals and academics collaborate to make a reciprocal relationship between research and practice work? How can both groups move from discussing the need for collaboration to actually acting on it? To date, neither group has done a good job developing sustained, mutually beneficial relationships. And this is the key: a collaborative relationship must be mutually beneficial.

There is strong interest on both sides in making this relationship work. The CIDM has already shown tremendous support by emphasizing the value of research at the 2012 Best Practices Conference and by publishing four recent industry/academe research initiative articles in the CIDM e-newsletter.

JoAnn Hackos has also started a special LinkedIn subgroup of the CIDM called Academic/Industry Collaboration. See “Results of the 2012 Best Practices Research Needs Survey” in the April issue for more details on how you can participate.

For the past few years, the academic community, too, has increasingly emphasized the need to build stronger academia-industry partnerships to improve both education and research in technical communication. This topic was the central theme at recent annual meetings of the Council for Programs in Scientific and Technical Communication and the Society for Technical Communication Academic Special Interest Group.

Strong interest and momentum on both sides make this an opportune moment for those of us in industry and academia to work together to develop, implement, and evaluate concrete solutions for improving information development practice and education. The CIDM is offering the infrastructure to support research projects and the forum to support dialogue between researchers, educators, and professional communicators. This infrastructure and forum offer great promise for cultivating and maintaining a reciprocal relationship between research and practice.

Research Approaches and Designs

Different research approaches are available for examining the questions on which the information development community would most like to see new research projects focus. In this section I review the advantages and limitations of quantitative and qualitative research approaches and describe research design possibilities for data collection. I also propose action research as a promising qualitative research model for developing and maintaining a reciprocal relationship between research and practice.

Qualitative and Quantitative Research

Both quantitative and qualitative approaches to research can produce data useful for problem solving. I provide a brief overview of each approach below:

  • Quantitative research. Researchers ask a specific, narrow question and collect numerical data from participants to answer the question. Researchers conduct statistical analysis of the data to discern trends and draw conclusions. Surveys and web analytics are the most common quantitative data collection methods used in technical communication.
  • Qualitative research. Researchers gather insight or knowledge about a topic in an attempt to understand perceptions, attitudes, and reasoning behind actions. Using qualitative methods, researchers are able to understand not only what is happening, but more importantly, why it is happening. Case studies, field studies, focus groups, and interviews are the most common qualitative data collection methods used in technical communication.

Whereas quantitative research can help us understand what people are doing and, to an extent, how they are doing it, qualitative research can help us understand what quantitative research cannot: why people are doing what they are doing. For example, web analytics is a common quantitative approach to understanding customer behavior in online environments. In his talk, “Recognizing Customer Behavior with Web Analytics,” at Best Practices 2012, Mark Hoeber of IBM discussed how web analytics can be useful for tracking online customer behavior, including how customers use online content. He cautioned, however, that web analytics data has limitations. Reports are just click-stream data, he said; they simply measure hits. While these reports can tell us how often customers are accessing information on a web site or what kinds of information they are accessing, they cannot tell us why customers do what they do or whether the information actually helps them achieve their goals. Nor can the reports tell us how people feel or how their feelings, behaviors, or practices change over time.
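Hoeber's point about the limits of click-stream data can be seen in a minimal sketch (the page names and events below are invented for illustration): aggregating raw click events yields hit counts, the "what," but nothing about customer intent, the "why."

```python
# A minimal sketch of the kind of aggregation a web-analytics report
# performs: counting hits per page from raw click-stream events.
# The pages and timestamps are hypothetical, invented for illustration.
from collections import Counter

# Each event records only what was accessed and when -- never why.
click_stream = [
    {"page": "/docs/install", "timestamp": "2013-05-01T09:14"},
    {"page": "/docs/install", "timestamp": "2013-05-01T09:20"},
    {"page": "/docs/troubleshooting", "timestamp": "2013-05-01T10:02"},
    {"page": "/docs/install", "timestamp": "2013-05-02T08:45"},
]

hits = Counter(event["page"] for event in click_stream)
print(hits.most_common())
# [('/docs/install', 3), ('/docs/troubleshooting', 1)]
```

The report can say that the install page was hit three times, but not whether those were three satisfied readers or one frustrated reader returning repeatedly; answering that requires the qualitative methods discussed below.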

To answer “why” questions and more complex “how” questions, a qualitative approach to understanding customer behavior is necessary. Our organizations tend to be very good at gathering quantitative data, and they have many tools with which to do so. They are not as good at gathering the qualitative data they need to gain a more complete understanding of a phenomenon and to figure out what to do next. Hoeber called for more qualitative research by product and UX teams to find out why people are doing what they are doing.

Results of the 2012 Best Practices research needs survey reveal that many research questions identified as important by survey respondents require a qualitative approach to collecting data. Some questions lend themselves well to a mixed quantitative/qualitative approach. The key is that quantitative and qualitative research methods can work together in answering questions.

Researchers have a range of on-site and off-site research designs available to them. On-site designs, which are almost always qualitative, can enrich understanding and improve practice in specific information development communities; they help us address what, how, and why questions. Off-site designs, which can be qualitative or quantitative, can enrich understanding and improve practice in the larger information development community. They, too, address what, how, and why questions, but they cannot always reveal what people are actually doing and how they are doing it: without observation, we can only know what people say they do. Table 2 offers an overview of different on-site and off-site research designs.

Table 2: On-Site and Off-Site Research Designs


The Potential of Action Research

Much qualitative research in technical communication has been descriptive: researchers create thick descriptions of how and why people act as they do in different organizational contexts. But, as a number of researchers have argued, descriptive studies do not attempt to actively facilitate change in those contexts.

I argue that action research may be our best bet for cultivating and maintaining a reciprocal relationship between research and practice. Researchers who conduct action research tend to position themselves as consultants in rather than observers of organizational contexts.

We have some good examples of action research, but not many. Gregory Cuppan and Stephen Bernhardt, in their 2012 article, “Missed Opportunities in the Review and Revision of Clinical Study Reports,” describe action research as “activity in which partnering clients engage with consultant researchers to identify problems, collect data, test solutions, and refine approaches through workplace training and feedback” (p. 134). Brent Faber, in his book, Community Action and Organizational Change: Image, Narrative, and Identity, adds that action research “begins with an interest in the problems of a community group, and its purpose is to help people understand and resolve the social problems that they face” (p. 189). Figure 2 illustrates the difference in researcher role, observer versus consultant, between descriptive and action research.

Figure 2: Descriptive Versus Action Research


The research projects on which Cuppan and Bernhardt have collaborated illustrate one way that action research might work in information development environments. Cuppan, who is the owner of a consulting firm that works with pharmaceutical companies, and Bernhardt, who is an academic researcher and professor, have worked together for nearly two decades as consultants hired to study, analyze, and improve documentation practices in pharmaceutical companies. During these years, they have worked with over 50 pharmaceutical companies to evaluate, coordinate, and improve clinical study reports. The authors write:

“This consulting provides us an opportunity to see practices from inside organizations. In important ways, our consulting role limits our control over study design because we can do only what clients want us to do and are willing to pay for. But it also means that our clients are ‘in on the action,’ working collaboratively with us to define the problems they experience with document development and to improve outcomes. Our clients describe their frustrations, identify problems with reports, and confirm or reject our suggestions about what is going right or wrong and what ought to be done.” (p. 134)

Both descriptive and action research models have the critical features of qualitative research. But action research offers the most promise for developing and maintaining a reciprocal relationship between research and practice.

Overcoming Empirical Research Challenges

Setting up and carrying out an empirical study, particularly one that employs action research, poses numerous challenges to both the sponsoring organization and the academic researcher. The sponsoring organization will likely need to put in place a non-disclosure agreement and other measures to protect proprietary information and ensure research subject confidentiality. The organization will also need to allot the resources and time necessary for data collection; some loss of productivity may be unavoidable.

Time and funding are two major challenges that academic researchers must negotiate to pursue empirical research. Academics have ongoing teaching and advising responsibilities, administrative and service obligations, and other research projects to which they must attend. Spending long periods of time collecting and analyzing data is a near impossible task without release time from at least some of these obligations. Release time can be awarded when internal or external funds are available to cover the cost of a course and/or service reduction. Funding for empirical research is also necessary to cover travel expenses, supporting resources, research assistants, or other requirements. Academic researchers would certainly benefit from greater opportunities to obtain funding that can buy them the time and resources they need to carry out empirical research projects.

One of the goals of the CIDM research initiative is to offer concrete solutions to some of the challenges of conducting the collaborative research projects that would most benefit the information development community. In their March CIDM e-newsletter article, Sid Benavente and Dave Clark describe crowdfunding as one potential solution to the question of how to fund research projects.

The next step in moving this initiative forward is to gain support from peers in industry and academia, to brainstorm additional research questions and refine the ones already identified, to formulate research project proposals that help us to answer those questions, and to build an infrastructure for supporting research projects. See “Results of the 2012 Best Practices Research Needs Survey” in the April e-newsletter for more details on how you can participate in the initiative. See “Results of the April 2013 Technical Communication Industry Research Needs Survey” for details on initial research projects that practitioners and academics have identified as top priorities.

1 Panel members included Pat Burrows of Global Information Management at EMC, Carolyn Rude of Virginia Tech, Dave Clark of the University of Wisconsin-Milwaukee, Bill Hart-Davidson of Michigan State University, and Rebekka Andersen of the University of California, Davis.

Carolyn Miller
“What’s Practical about Technical Writing.”
Technical Writing: Theory and Practice. 1989

Ann Blakeslee and Rachel Spilka
“The State of Research in Technical Communication”
Technical Communication Quarterly
January 2004

Barbara Mirel and Rachel Spilka
Reshaping Technical Communication: New Directions and Challenges for the 21st Century
2002, Mahwah, NJ
Lawrence Erlbaum Associates
ISBN: 0805835172

Gregory P. Cuppan and Stephen A. Bernhardt
“Missed Opportunities in the Review and Revision of Clinical Study Reports”
Journal of Business and Technical Communication
April 2012

Brenton D. Faber
Community Action and Organizational Change: Image, Narrative, and Identity
2002, Carbondale, IL
Southern Illinois University Press
ISBN: 0809324369

Christopher Thacker and David Dayton
“Using Web 2.0 to Conduct Qualitative Research: A Conceptual Model”
Qualitative Research in Technical Communication