Results of the April 2013 Technical Communication Industry Research Needs Survey


Sid Benavente, Ph.D., Microsoft Corporation, Carolyn Rude, Ph.D., Virginia Tech, Bill Hart-Davidson, Ph.D., Michigan State University, & Rebekka Andersen, Ph.D., University of California, Davis

Thank you to all of you who participated in our recent Technical Communication Industry Research survey! The survey asked a common set of questions about research needs to both industry professionals and academics. What is clear is that while the priorities for these two groups can differ quite dramatically, the two groups’ missions are intertwined. We have some work to do to better understand the differences. But we are excited to note that we also have some clear shared priorities with which to begin engaging one another.

Conversations like the one at Best Practices 2012 in Monterey and, we hope, those that follow from sharing the results of surveys like this one are a way to begin. We foresee productive collaborations between academics and industry professionals in various roles: employers and those charged with preparing employees, researchers, and practitioners, among others. There may well be valid explanations for the different priorities. But the differences in research priorities need not be where we begin. We can start with where we agree. And by working together more often, we can perhaps learn more about how both groups’ compelling questions may reflect shared concerns.

The results of the survey are detailed below.

Who Participated?

At the 2012 CIDM Best Practices Conference in Monterey, conference participants identified numerous specific research questions. (See the appendix.) The survey has been important for expanding the number of people identifying questions and for establishing priorities.

A total of 261 participants responded to the survey. Of these, 190 (73%) identified themselves as practitioners in technical communication, 54 identified themselves as academics, and the remaining 17 selected ‘other.’

Respondents who selected ‘other’ and provided additional information identified themselves in these categories:

  • both academic and practitioner (4)
  • software vendor (3)
  • manager (3)
  • project manager (1)
  • consultant (1)
  • retiree (1)
  • academic outside of Technical Communication (1)
  • training manager (1)
  • student (1)

Most (78%) respondents were from the United States, but some global representatives participated:

[Table: survey respondents by country]

What Research Is Most Important to the Survey Participants?

In the survey, respondents could rank eight research categories from 1 to 8, with 1 being “most important” and 8 being “least important.” Respondents could assign each rank to only one topic. The following tables show the research priorities of the two largest groups of respondents (industry professionals and academics) by totaling the number of rankings of 1-3 (most important) and the number of rankings of 6-8 (least important) for each topic.

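To make the tallying method concrete, the sketch below shows one way such counts could be computed. It is a minimal illustration only: the two sample responses and the shortened topic labels are invented for this sketch, not the survey’s actual data or analysis code.

```python
from collections import defaultdict

# Minimal sketch of the tallying described above. The responses below are
# invented examples, NOT the survey's actual data. Each respondent ranks all
# eight topics exactly once, from 1 (most important) to 8 (least important).
responses = [
    ("practitioner", {"Content Strategy": 1, "Metrics/Measurement": 2,
                      "User Behavior": 3, "Process/Practice": 4,
                      "Roles": 5, "Value Proposition": 6,
                      "Training": 7, "Tools": 8}),
    ("academic",     {"User Behavior": 1, "Training": 2,
                      "Process/Practice": 3, "Roles": 4,
                      "Content Strategy": 5, "Value Proposition": 6,
                      "Metrics/Measurement": 7, "Tools": 8}),
]

# Count ranks 1-3 as "most important" and ranks 6-8 as "least important",
# tallied separately for each (group, topic) pair.
tallies = defaultdict(lambda: {"most": 0, "least": 0})
for group, ranking in responses:
    for topic, rank in ranking.items():
        if rank <= 3:
            tallies[(group, topic)]["most"] += 1
        elif rank >= 6:
            tallies[(group, topic)]["least"] += 1

for (group, topic), counts in sorted(tallies.items()):
    print(f"{group:12} {topic:20} most={counts['most']} least={counts['least']}")
```

Run over the full response set, these per-group counts are what the shaded tables below summarize.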
To read the tables, note that within each topic area (the eight bolded columns), the darkest green at the top of the column indicates the highest number of “Most Important” rankings, whereas the darkest green at the bottom of the column indicates the highest number of “Least Important” rankings. Conversely, the darkest red indicates the lowest number of rankings. (See the appendix for the full titles and descriptions of each topic area.)

[Tables: “Most Important” and “Least Important” ranking counts for each research topic, by industry professionals and by academics]

How Do Research Priorities for Practitioners and Academics Align or Diverge?

Both groups express a need for research in user behavior—third priority for industry professionals, second for academics. Both groups rank processes and practices in the top half of their priorities. Neither group is particularly interested in research on tools.

Variations in the two groups may reflect their different missions. Academics prepare graduates for career positions. Thus, it is not surprising that training interests them to a much greater extent than it does industry professionals.

The greatest divergence is in the topics of content strategy and metrics—the top two issues for industry professionals but in the second tier of priorities for academics. These issues reflect the daily work and needs for decision making by industry professionals, but they affect academics less. This divergence does not mean that academics would not be interested in research in these areas. The survey results will help to make them aware of the need.

[Chart: research priorities of practitioners and academics compared]

Looking at relative importance, we see the largest difference of opinion in the areas of content strategy and training, and the least divergence in the area of process.

[Chart: relative importance of each research topic for practitioners and academics]

What Are the Field’s Priorities for Research?

In a reciprocal research relationship between practitioners and academics in technical communication and information design, practitioners identify research questions, and academics conduct the research and share the results to improve practice. Both parts of the field, as well as the field as a whole, benefit from this relationship.

In spite of the divergent research priorities of practitioners and academics reflected in the survey results, there does seem to be some consensus that areas of high priority include content strategy, user behavior, and processes and practices. These topics reflect changes in how the field produces and manages information as a result of cost and time pressures, new technologies, and leaner staffs. They also reflect changing users, who now rely on a variety of devices to find information and learn to use products. In a period of rapid change, industry professionals must invent on the fly; systematic research could help determine whether their adaptations and decisions have worked. Although academics are somewhat more interested in training and in the roles of technical communicators (reflecting their purpose of preparing graduates for their careers), research on content strategy and user behavior could provide useful information for their work in the classroom.

In short, the survey suggests these priorities for the field’s research:

1. Content strategy
2. User behavior
3. Metrics/measurements
4. Process/practices

Next Steps

The first step in developing a robust research relationship between academics and industry professionals is to define the research needs; the survey provides an excellent starting point. The second step is to publicize its results, through CIDM and through the LinkedIn group.

The kinds of studies that provide answers about content strategy, user behavior, and processes and practices require funding, so a third step is to develop a fund to support studies that will enhance practice. Crowdfunding was suggested at the Best Practices conference and described in the March e-newsletter article. Details need to be worked out for defining the type of project(s) to fund, raising funds, calling for proposals, evaluating the proposals, and establishing the terms of funding.

Please join the Academic/Industry Collaboration subgroup on LinkedIn to learn about potential joint research projects already being discussed and to contribute your research project ideas. An immediate goal for the subgroup might be to discuss potential studies within one of the top four categories of research.

From this discussion, the CIDM will invite research project proposals. As described in the April e-newsletter article, the CIDM will create a new research section on the CIDM website to announce research projects and solicit research proposals. We hope that details of the RFP process and the proposal submission process will be discussed in the LinkedIn subgroup, so please join the conversation and share your ideas.

To learn more about the research initiative, see also the article “The Value of a Reciprocal Relationship Between Research and Practice” in this issue.

Appendix

Survey Description of Research Areas

[Table: full titles and descriptions of the eight research areas as presented in the survey]

Compilation of Research Questions from Monterey Conference

User Behavior Studies
  • We write to people, not machines. We need more research on the human side. How are people using the information? How do they feel about it? What is useful and not useful?
  • What are users thinking when they read the content? Why are they accessing it? Is it useful? Why or why not? Why are people accessing content two or more times? Why are they leaving pages? Did they leave these pages more often because their questions were answered well or because the content caused them to give up?
  • What information do users actually use? What information is not needed? We need to get away from the idea that everything needs to be complete. We have no studies that tell us what information can be cut, only guesses. Do we need to explain the OK button four times?
  • How do end users react to problems and frustrations? What are their habits—pick up the phone, read documentation, Google for an answer? How are their habits changing? And how can we use this knowledge to intervene in and reduce or eliminate customer pain?
  • What output types (video, document, wizard) work best for given user groups? What type of information/content would be more appropriate in a portal than an e-book? When is video documentation preferred over traditional text-based documentation?
  • How do generation differences in users influence how they access and use information?
  • How does a story contribute to the effectiveness of technical content? Sometimes a coherent narrative gets lost in the pursuit of minimalism.
  • What makes users notice warning and caution statements? These are required by the FDA in medical device instructions, yet the only studies published are for tractor manuals. How can we write useful warnings and ensure that users notice them?
  • How effective are videos in user assistance, and what percentage of users prefer video for instructions and for learning?
  • How do software developers and users really interact with organizations using social media? Can this type of collaboration support improved quality of content? Can this type of collaboration scale or work with small pubs teams?
Process/Practice Studies

Agile Development

  • What tenets of Agile are working well, and what tenets are wasteful? How can we best integrate information development into an Agile environment?
  • How does the amount of time a writer spends writing content for a feature differ between a traditional waterfall development process and an Agile environment?
  • What changes do we need to make in the entire organization to succeed in producing quality documentation in an Agile environment?
  • How can content development processes be adapted to integrate efficiently with Agile development processes?

Impact of Cultures/Industries

  • How do information development processes differ in different industries, such as software, hardware, medical, and regulated industries? How do these processes differ among industries with publicly available documentation vs. those with highly proprietary documentation with controlled access?
  • What are the variations in department practices and behaviors region by region and/or industry by industry? Are there variations in practices attributable to where a department sits in the organization (engineering, support, shared services)?
  • How do we promote and help establish the technical communication profession in new geographies, such as China and India? What is the state of the profession in those regions? What challenges do information development teams in new geographies experience?
  • What are the effects of different corporate cultures on information development processes?
  • How do changes in information delivery by large organizations impact strategic planning for small companies?
  • What changes in organizational behavior result when the organization is not necessarily the authoritative voice on its products anymore?
  • In what kinds of contexts has enterprise-wide adoption of DITA been effective? What factors contribute to adoption success?

Best Practices

  • What are best practices for writing for shrinking mobile devices? What are best practices for developing on-product help for touch screen products?
  • What information needs to go in the product and what information needs to go in the documentation? What redundancies can be cut out? What information is most effective as embedded content and what information is most effective as documentation?
  • How can we best write for users with limited literacy? How can we best write for different age groups?
  • What are the best strategies for implementing scenario-based content?
  • What are best practices for authoring multiple versions of the same product concurrently in a content management system?
  • How do we best stream media as part of information development?
  • How can information development teams effectively support developer ecosystems (code samples, developer guides, levels of expertise)?
  • What quality assurance practices are most effective?
  • What strategies for getting documentation team members involved in software interface design (moving to the front of process from the end of process) have proven most effective?
Content Strategy Studies
  • What strategies have organizations adopted for implementing content management processes that provide optimal balance between facilitating reuse and accommodating the need to version content to support multiple concurrent releases?
  • What strategies have organizations adopted for implementing a continuous update model without drastically increasing localization cost and process complexity?
  • What are best practices for transitioning the translation review process in an XML/DITA environment?
  • What media are best for what kinds of user situations?
  • What will be the next-generation content architecture using scenario-based content, online help (OLH), and user experience (UX) design? What will be the most effective combination?
  • What strategies work best for building an information architecture for a large documentation set?
  • Where does content management (CM) break down, and why? How do we best plan for complexity management: managing users, authors, content managers, and CM solution architects? What impact do these management decisions have on usability and sustainability?
  • What are best practices for integrating content from companies, customers, and communities? For organizations just getting started, what are the best places to start? For example, do we start first by listening for unmet customer needs and/or determining what types of information are the most valuable to users? What activities should be outsourced? When does it make sense to outsource activities vs. keep them in-house?
Metrics/Measurement Studies
  • How do departments evaluate their own success? How do their companies evaluate the success of information development departments?
  • What metrics have proven useful for measuring optimal resource levels for content development teams?
  • Are there quantifiable differences in information quality as a result of Agile development?
  • How has the shift to collaborative authoring affected the quality of content?
  • How can we quantify the value added by technical content?
  • How can we best measure the productivity and morale of remote writers/teams vs. onsite writers/teams?
  • How can we quantify the value of an editor? How can we best measure editing activities (style and usage guideline development, terminology management)?
  • How many topics can a writer effectively and accurately develop per day? How many words? What percentage of reuse can we measure?
  • What is the optimum ratio of product developers to information developers?
  • What metrics have proven useful for measuring content quality and accuracy?
  • What metrics can we use for identifying unused information?
  • How can we measure content reuse in our organizations (chapters, topics/pages, sections, tables, lists, figures, paragraphs, inline elements)?
  • How can we measure the production impact of technologies and tools, such as XML, DITA, content management systems, and standardization?
  • Is there a good method for measuring the deflection of support calls?
Technical Communicator Roles Studies
  • How significant is the gap between technical writing graduates and entry-level content developer requirements? How do we best handle this gap?
  • What are the different roles of today’s technical communicators and what new job classifications best represent these roles? How can we get organizations such as Radford to adopt these classifications?
  • How can we best develop new job descriptions based on the different roles of today’s technical communicators (production specialist, DITA/XML architects, CMS administrator, content developer)? How can we ensure wide adoption of these descriptions?
  • How have different organizations redefined team roles and job titles? What factors influenced the creation of these new roles and titles?
  • What are best practices for staffing information development? Are ratios for staffing or resource planning reliable and/or valuable?
  • What does the work environment of today’s technical communicator look like? Are technical communicators generally happy working in these environments? What is the turnover rate for technical communicators in an organization, and what accounts for this rate?
  • What is the effect of having a well-defined, formal, information architect role?
  • How much time do content developers actually spend writing content?
Training Studies
  • What skills and knowledge are we teaching that are actually helping students get hired and that are actually useful on the job?
  • What are effective training strategies for transitioning to topic-based, task-based documentation?
  • How do technical communicators learn? Does a specific educational background determine learning behaviors/processes?
  • Can we apply a more scientific educational methodology to our field?
Value Proposition Studies
  • How can we best understand our value and articulate that value to stakeholders? What does our value proposition need to reflect?
  • How can we translate the value technical publication organizations bring to their organization into the return on investment language that will gain the attention of corporate decision makers?
Tool Comparison Studies
  • How do different content technologies compare (content management systems, translation management systems, authoring tools, and output production options)? What are the potential advantages and disadvantages of the different features and capabilities of these technologies?
  • What kinds of quantitative and qualitative data exist that could help organizations make decisions at different stages of the tool evaluation, implementation, and adoption process? For example, for an organization that needs to select a translation management system, what data is available on product options, requirements, impacts on staff and process, cost/benefit, balancing, and return on investment?
  • What are best practices for evaluating content technologies? What criteria, tests, and independent benchmarks are available?
