
CIDM

April 2013

 


Applying Simplified Technical English To Your Content: The HyperSTE Framework


Brian Gajadhar PhD MSc, Etteplan | Tedopres

Abstract

Three steps are required to achieve STE compliant content: receive training, use dictionaries, and use checkers. However, authors often rely too heavily on STE-checkers and may forget what their main goal is: to write an unambiguous and simple procedure according to universal rules. Consequently, a text may be fully STE compliant but still hard to understand. At Etteplan | Tedopres we focus primarily on proper writing skills acquired through practical experience, such as training and best practices. Our view is that the critical balance between the expertise of the author and the feedback from the STE-checker should result in high-quality content. Support for this view comes from three separate empirical studies, the findings of which are explained with the HyperSTE Framework. Adopting this framework will make written content clearer and more STE compliant, and will make applying STE less time consuming and more efficient.

Introduction

It is interesting to review the different responses you receive when asking someone how they would change a flat tire, particularly since the procedure has not changed in decades. The diversity results from the different ways that people communicate (Knapp & Daly, 2002). Some can be very clinical in their approach, while others are more enthusiastic and may use hand gestures and facial expressions to make the interaction more efficient (Argyle, 1988). However, when these people are asked to write down the typical tasks that have to be performed to change a tire, the non-verbal cues of face-to-face interaction cannot be used. Without the ability to use hand gestures, eye contact, and facial expressions, interaction may be less smooth (Mehrabian & Ferris, 1967; Short, Williams & Christie, 1976). The communication quality then depends heavily on the words chosen by communicators and the structure of their texts. The different responses may also derive from an individual’s intellect, experience, gender, and personality, which may affect the way he or she describes a procedure (see e.g., Gefen & Straub, 1997). In the end, we would receive many procedures that describe the same task but are different in length and word choice.

Language has great power to explain how things work, to instruct, and to warn. For technical products and systems, documentation should enable appropriate use and prevent injuries. Unclear and ambiguous technical documentation can have life-threatening effects, which is why it is crucial that the language used is clear and easy to understand. It is undesirable to receive different explanations of how to perform a single task (for instance, change a flat tire); the more discrepancies between explanations, the higher the risk of misinterpretation. Moreover, technical writers tend to use special vocabularies (jargon), personal styles, and complicated grammatical constructions that may make texts difficult to understand by both non-technical and technical audiences (Shih, 2010). Considering these facts, the technical information community has needed a method to create uniform and simple content in a standard way (Dekker & Wijma, 2004; Boeing, 2012). Hence, writers must follow rules for writing, including the use of short, simple sentences conveying accurate information. Although there have been a few initiatives for controlled languages (Caterpillar Technical English or Plain Language), Simplified Technical English (STE) is the international specification (ASD, 2013) used for controlling language in technical publications1.

STE is a controlled language with a limited set of grammar rules, style, and vocabulary, based on the principle that one word has only one meaning. It was initially designed to create clear and understandable technical English in the aerospace and military sectors, particularly for non-native speakers. Research has shown that the use of STE yields significant improvements in the quality of technical information while reducing its length (Disborg, 2007) and increasing the ease of translation (Van der Eijk, 1998). For instance, Shubert and colleagues (1995) revealed significantly better task performance for both native and non-native speakers of English when reading an STE version of a manual compared to a non-STE version. Spyridakis and colleagues (1997), who investigated the quality and ease of use of translations from STE into other languages, demonstrated that STE can be accurately translated. They revealed that translations from STE into other languages were significantly better in terms of accuracy, style match, and comprehensibility and contained fewer mistranslations than a non-STE document. These findings explain why technical content should be written in STE, as it already is in many organizations in the software, telecommunications, medical, automotive, and semiconductor sectors.

Applying STE

To successfully apply STE in each writing process, at least three steps are required before getting started (Braster, 2009). First, technical writers have to be trained how to standardize the content of their documents (see also Gajadhar, 2011). They must follow the STE writing rules and use a restricted vocabulary that is specifically standardized for a company’s technical publications. Specific training with hands-on exercises focusing on workarounds for common problems should provide information on how to apply STE (Chiarello, 2012). Restrictions in allowed vocabulary are derived from dictionaries that writers must conform to. A client-specific dictionary is built on the principle that one word has one meaning and contains approved words and unapproved synonyms. Only the terms with the least ambiguous meaning that are most understandable to the audience are approved. Since the interplay of complex rules and dictionaries increases the risk of overlooking errors, checker software is necessary to help writers check text for STE compliance. Checkers should contain the ASD and client-specific dictionaries as well as algorithms that ensure compliance with the STE rules. Such a tool may perform many of the mechanical aspects of checking, facilitate quality assurance of information, and increase satisfaction in the workflow (Disborg, 2007). In the next section, the most important checkers available in the field of technical documentation are discussed.
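The dictionary principle described above—one approved term per meaning, with each unapproved synonym pointing to its approved replacement—can be sketched as a simple lookup. The terms and structure below are illustrative only; they are not taken from the ASD dictionary or any client-specific dictionary:

```python
# Illustrative sketch of dictionary-based STE term checking.
# The terms below are invented examples; a real client-specific
# dictionary holds thousands of approved terms and unapproved synonyms.

APPROVED = {"start", "stop", "make sure", "axle", "bearing"}

# Each unapproved synonym maps to a single approved replacement,
# following the one-word-one-meaning principle.
UNAPPROVED = {
    "commence": "start",
    "begin": "start",
    "halt": "stop",
    "verify": "make sure",
    "ensure": "make sure",
}

def check_terms(words):
    """Return a list of (word, suggestion) pairs for unapproved terms."""
    issues = []
    for word in words:
        if word.lower() in UNAPPROVED:
            issues.append((word, UNAPPROVED[word.lower()]))
    return issues

# Flags "Ensure" -> "make sure" and "commence" -> "start":
print(check_terms("Ensure you commence the procedure".split()))
```

A production checker would, of course, also handle multi-word terms, part-of-speech context, and the STE grammar and style rules; this sketch shows only the vocabulary lookup.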

STE-Checkers

The most common checker is probably the content checker that Microsoft Office provides for spelling and grammar in Word, Outlook, and PowerPoint (Microsoft, 2012). This product has a default vocabulary the user can customize by adding unknown words to the dictionary. The choice of dictionary, what to include in the checking, and the rules related to grammar and spelling can all be set by the user. A more sophisticated checker tool provided by Congree also includes simplified English and related technical writing rules (Congree, 2012). This checker offers two sets of rules: one for checking documents in Simplified English2 and one based on the style rules intended for technical manuals. In addition to selecting the set of rules, Congree allows the user to enable or disable checking for each error type (spelling, grammar, abbreviation, and style). Both checkers provide a dialog box during the check, informing the users about the error and allowing them to choose if, and how, they want to correct it.

Since the dictionary used in combination with STE-rules is crucial, the above-mentioned checkers seem less suitable for Simplified Technical English. Therefore, companies have developed specific checkers intended for technical documentation. For instance, Acrolinx GmbH has released a checker that seems suitable for STE (Acrolinx, 2012). Unlike the previously mentioned checkers, it displays feedback in the form of results in order to evaluate the checking. It has a report function, which is useful for documenting the detected issues or sharing them with other users. It is possible to validate terms, make changes, and create links between preferred and deprecated terms. A similar, but stricter STE-checker has been developed by Boeing (2012); the Boeing Simplified English Checker (BSEC) primarily focuses on the aerospace industry. Etteplan | Tedopres developed a checker (HyperSTE) specifically for use in the technical content creation process for both aerospace and non-aerospace industries (Etteplan | Tedopres, 2012). This checker uses three dictionaries, applies all STE rules, provides detailed checking feedback, and can generate reports based on the checking results. Although the checker has been developed from a standardized specification, it allows company-specific dictionaries, configuration of the set of rules, and a strict as well as a more tolerant version of the checker. HyperSTE is primarily designed as an assisting tool for technical authors and demands the author’s focus on its proper use in the writing process.

The Philosophy of Etteplan | Tedopres

As indicated above, a checker is merely one of the required instruments to start applying Simplified Technical English. Of course, the checker should make no mistakes and should enhance the quality of the content. Furthermore, ease of use becomes a major point in the evaluation of the product (Dumas & Redish, 1999). The user interface, navigation, and look-and-feel must be designed to easily cover all the desired tasks without any constraints (Constantine, Biddle & Noble, 2003). How the checker is adopted in the writing process is as important as the rules and dictionaries or the usability of the checker. One does not merely provide a checker; the focus is first on how to achieve simple and understandable text by using knowledge acquired through practical experience (training and best practices). The checker “is seen as an additional tool that may never lead the writing process; the proper way to use checkers in the process is a skill of its own.”

Aim of the Study

Only a few of the available checkers in the field of technical documentation are suitable for use with STE. Although they may be very good at applying STE rules, these checkers are still inefficient in detecting whether the content makes sense. There should be a balance between applying STE rules and the common sense of a trained and experienced author; this point seems to receive little focus in many companies that claim3 to provide STE-checkers. Authors may often rely too heavily on STE-checkers (Fonseca, 2006) and forget what their main goal is: to write an unambiguous and simple procedure according to universal rules. Since checkers provide an STE compliance rating, this number may become a writer’s main focus, leading to a text that is fully STE compliant but still hard to understand. The focus must be on proper writing skills acquired through practical experience, training, and best practices. Applying STE is part of the whole process, where the STE-checker is merely an additional tool. The use of this tool—and its STE compliance ratings—may never become a writer’s main focus. It is the critical balance between the expertise of the author and the feedback from the STE-checker that should result in high-quality content. Support for this view has—to the best of our knowledge—been underrepresented in the academic literature on developing technical documentation. Therefore, we aim to provide evidence for this view below in three separate studies which used a state-of-the-art checker, experienced authors, and a well-known sample text.

Methods & Results

Study 1: Usability Tests

Method

To analyze the current commercially available STE-checkers, a survey was conducted and interviews were held with technical authors who use STE (Nsur=24, 9 female; Nint=3, all male). The USE questionnaire (Usefulness, Satisfaction, and Ease of use; Lund, 2001) was used to explore the usability of the most common checkers. Typical statements such as “It is easy to learn to use the checker” and “I am satisfied with the checker” were included4. Reliabilities of all scales were excellent (.88 ≤ α ≤ .96). The interview results were used by the experts to explain the quantitative data from the survey.

Results

Results of the usability questionnaire described the perceived usability of the most common checkers available. For usefulness, ease of use, and ease of learning, the rated score was above average; for satisfaction the score was below average. See Figure 1 for the results.

Braster_Figure1

The scores revealed that the perceived usability of checkers is acceptable in terms of usefulness, ease of use, and ease of learning; however, the general satisfaction score was significantly lower. To understand this deviation in the satisfaction levels, interviews were held with participants who provided arguments for their scores, such as:

“It is very frustrating to find out after eight hours that your hard work has not significantly increased the compliance rating.”

“Is it because of regulations we have to write according to these rules, or do people indeed understand it better? You know, sometimes I make a sentence STE compliant although I’m sure that the original was much clearer to me.”

The researchers collected all arguments and discussed them in a focus group with other experts. They concluded that many authors lack sufficient information about STE best practices. They seem not to understand how to achieve a balance between achieving “STE compliance” and a text that makes “common sense.” To understand the importance of this balance, two more studies were conducted on STE compliance and the comprehensibility of text.

Study 2: STE Compliance

Method

To study the important balance between using an STE-checker (STE rules and dictionaries) and practicing common sense (acquired through training and experience), two additional experiments were conducted. Chapters from the Bike Example (Haslam & Schaefer, 2012) were chosen as test objects in both experiments. The document is a well-known example to explain the S1000D specification5 in technical documentation. It was written by the Chair of the S1000D Bike Sample Working Group Members, a member of the S1000D Steering Committee. Since 2002, the document has been updated by experienced and trained authors; for the current experiments the latest version was used. The original chapters (Common Sense) were rewritten by applying the strict (STE) rules; all rules were applied blindly to achieve the highest compliance rate with an STE-checker. In addition, the original chapters were rewritten, using the checker, with the focus on writing comprehensible text (STE + Common Sense)6. This objective was achieved by choosing a skilled, experienced, and well-trained author (34 years old; female) with expertise in the field of STE who rewrote the original text. An example of the original and the rewritten text is given below:

Common Sense (35 words):

Make sure that you do not lose the bearings from the hub. Be prepared to catch the bearings if they fall out. Pull the axle out from the other side as shown in Fig 1.

STE (32 words):

Make sure that you do not lose the bearings from the hub. Prepare to catch the bearings if they fall out. Pull the axle out from the other side see Fig 1.

STE + Common Sense (28 words):

Make sure that you keep the bearings from the hub. Catch the bearings if they fall out. Pull the axle out from the other side. See Fig 1.

For this experiment, the STE Compliance Rating was documented for all chapters (Common Sense, STE, STE + Common Sense) using the HyperSTE reporting feature. The STE compliance rating results from a formula that is based on the errors found, weighting factors (importance of errors), and the number of words within the document (Etteplan | Tedopres, 2012).
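The exact HyperSTE formula is not published; the sketch below only illustrates the general shape the description implies—error counts multiplied by severity weights and normalized by the number of words—with hypothetical error categories and weight values:

```python
# Hypothetical sketch of a compliance rating of the general form the
# article describes: errors weighted by severity, normalized by word
# count. Categories and weights are invented, not HyperSTE's own.

WEIGHTS = {"vocabulary": 1.0, "grammar": 2.0, "sentence_length": 0.5}

def compliance_rating(errors, word_count):
    """Return a 0-100% rating; more or heavier errors lower it.

    `errors` maps an error category to its number of occurrences.
    """
    weighted = sum(WEIGHTS[cat] * n for cat, n in errors.items())
    penalty = weighted / max(word_count, 1)      # weighted errors per word
    return max(0.0, 100.0 * (1.0 - penalty))

# A 200-word chapter with a few issues of varying severity:
print(round(compliance_rating({"vocabulary": 4, "grammar": 2}, 200), 1))  # 96.0
```

Normalizing by document length means that a rewrite which both removes errors and shortens the text, as in the examples above, is rewarded on both counts.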

Results

Linear Mixed Model Analyses (analyses of variance) were performed on the STE compliance ratings of the original chapters and the chapters manipulated with Writing Method (Common Sense, STE, STE + Common Sense) as a between-groups factor. The analyses showed significant differences (F(2,30) = 22.65; p<.001) in STE Compliance between all Writing Methods. Contrast analyses showed that ratings differed significantly between all methods (p=.01); we may therefore conclude that the manipulations were successful. Chapters that were rewritten according to the strict STE rules showed an increase in the STE Compliance Rating. The same holds true for the method where the author focused on the balance between common sense and the STE rules. However, compared to the strict method, the increase is smaller. See Figure 2 for the results.

Braster_Figure2

Study 3: Comprehensibility

Method

In the third experiment, participants (N=20) were asked to rate the chapters for comprehensibility. None of the participants knew the exact purpose of the experiment; 20 percent had some knowledge of STE and 50 percent were technical authors. They received three randomly chosen chapters; for each chapter they received the original and the versions rewritten using the STE and the STE + Common Sense methods (nine chapters in total). Participants were asked to evaluate the chapters (from 1 = very easy, to 7 = very hard) for 5 minutes each and answer two questions that measured comprehensibility: “The text is easy to understand” and “The text is easy to read” (α = .92). They were unaware of the methods that were used to create the text in the chapters.

Results

To demonstrate that the quality of written content is not only based on compliance with STE regulations but also on the comprehension of text, a third study was performed on the example chapters. Participants indicated the overall comprehensibility of the content retrieved from documents drafted using the three different writing methods. Linear Mixed Model Analyses (Repeated Measures) were performed on the subjective data, using Writing Method as a between-groups factor and Participant Number as a random factor. The analyses showed significant differences in Comprehensibility among all Methods (F(2,81.7) = 46.49; p<.001). Chapters fully rewritten according to the strict STE rules significantly decreased the comprehensibility of the text as compared to the original text. By contrast, compared to the original text, chapters to which both STE and common sense were applied showed a significant increase in comprehensibility. See Figure 3 for the results.

Braster_Figure3

Discussion

Diverse communication styles, shaped by intellect, experience, and personality (Knapp & Daly, 2002; Gefen & Straub, 1997), may be the primary reason to standardize the creation of technical information. Technical content has to be understood by different target audiences in an unequivocal manner (see Gajadhar, 2012; Shubert et al., 1995; Jansen & Balijon, 2002). Simplified Technical English (STE) is a language with a restricted set of grammar, style, and vocabulary that can accomplish this goal (Chiarello, 2012; Dekker & Wijma, 2004). Some companies have developed content checkers to apply STE successfully in each writing process. However, most companies mainly focus on the software rather than the writing process as a whole. Applying STE is a part of the whole process, wherein the STE-checker is merely an additional tool. The use of this tool—and its STE compliance ratings—should never become a writer’s main focus. It is the critical balance between the expertise of the author and the feedback from the STE-checker that should result in high-quality content. Since support for this view has been underrepresented in the literature, our aim in this article has been to provide scientific evidence with three empirical studies.

Explaining the Results

From the usability study, it was concluded that authors’ satisfaction with checkers is not acceptable, because many authors lack sufficient information on the use of STE (best practices). They seemed unaware of the balance between achieving STE compliance and a text that makes common sense. To show the importance of this balance, a performance test was conducted where chapters of the Bike Example (Haslam & Schaefer, 2012) were rewritten according to methods that differed in the way they affect STE compliance. Chapters fully rewritten by applying strict STE rules increased in STE compliance, yet decreased in perceived comprehensibility as compared to the original. In the case where the author focused on the balance between common sense and the STE rules, the increase in STE compliance was smaller8, but the comprehensibility significantly increased. These results demonstrate that the quality of written content is not only based on compliance with STE regulations but also on the comprehensibility of text. This balance can only be achieved by well-trained authors using STE-checkers, ideally in combination with authors’ best practice experiences acquired from earlier cases, training, or consultation (see also Chiarello, 2012).

The HyperSTE Framework

The critical balance between applying STE and writing comprehensible content is best achieved by experienced and well-trained authors using high quality STE-checkers. From the checker users’ side, the need for training and best practices should be acknowledged; from the checker providers’ side, training and consulting should be provided. Unfortunately, on both sides this awareness often seems absent, causing inappropriate use of STE that yields STE-compliant but incomprehensible content. Etteplan | Tedopres offers specific training and consulting on the meaning and use of STE combined with HyperSTE, using the HyperSTE Framework (see Figure 4).

Braster_Figure4

The framework demonstrates the ideal workflow, where the adoption of the content checker is part of the whole writing process. Hence, it illustrates two key services that providers of STE-checkers should offer their customers: Training and Best Practices and IT and Linguistic Support.

Training and Best Practices

In this article, we have shown that offering just a checker is not enough to create STE compliant content. Knowledge of the STE writing rules and of “how and why” to use the STE-checker is critical to achieving high quality documentation. Theory should be followed by practice, which is why—besides training—discussing best practices must be part of the instruction given to customers. Authors must be provided with hands-on experience of the important balance between STE and common sense. After the training, authors should understand that checkers and their compliance ratings are supplementary. Their main focus must be on creating understandable and simple text. As a result, their output will be clearer, more STE compliant, and produced more efficiently.

Linguistic and IT Support

Although one would expect so, not all companies that provide STE-checkers offer full support. For installations, bugs, and other technical issues, most companies may provide IT support; however, linguistic support is rare. Linguistic support includes building and maintaining company-specific dictionaries. Building dictionaries is a time-consuming specialty that must be undertaken by linguistic experts. Experts extract technical terms from the original documentation set and subject these to linguistic analysis (removing ambiguous terms) and statistical analysis (identifying terms that are most used). Depending on the size of the company and its documentation, a company-specific dictionary will usually contain between 2,000 and 3,000 technical terms to supplement the core STE dictionary. Such dictionaries are dynamic files that should be periodically maintained and continuously controlled by one or two terminology managers (preferably experienced linguistic experts). Since such expertise is absent within many companies, it becomes important to either have linguists on staff or use consultants.
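The statistical step of this dictionary-building workflow can be illustrated with a short sketch that ranks candidate terms by frequency across a documentation set. The stopword list and example texts are illustrative; the linguistic review that removes ambiguous terms would follow as a separate, expert-driven step:

```python
# Sketch of the statistical analysis in dictionary building: extract
# candidate technical terms from a documentation set and rank them by
# frequency. The stopword list and sample texts are illustrative only.
from collections import Counter
import re

STOPWORDS = {"the", "a", "an", "of", "to", "and", "is", "in", "it", "for"}

def candidate_terms(documents, top_n=10):
    """Return the top_n most frequent non-stopword tokens."""
    counts = Counter()
    for text in documents:
        for token in re.findall(r"[a-z]+", text.lower()):
            if token not in STOPWORDS:
                counts[token] += 1
    return [term for term, _ in counts.most_common(top_n)]

docs = [
    "Remove the axle from the hub. Make sure the bearings stay in the hub.",
    "Install the axle. Check the bearings and the hub for wear.",
]
print(candidate_terms(docs, top_n=3))
```

In practice the frequency ranking only proposes candidates; the terminology manager then decides which candidate becomes the approved term and which synonyms are marked unapproved.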

Conclusion

This article discussed the critical balance between applying STE and writing comprehensible content. This balance is best achieved by experienced and well-trained authors using high quality STE-checkers. The main focus of authors should be achieving simple and understandable text by using the knowledge acquired through specific training and consulting. The checker is seen as an additional tool that should never, alone, control the writing process.

Acknowledgments

The studies conducted in this article are the result of a pleasant and successful collaboration with Nina de la Motte as part of her internship at Etteplan | Tedopres. Furthermore, the author appreciates the support of Peter Sistermans (Manager Translation Department Intellectual Property and Standards), Mechteld Groot Bruinderink (Graphic Designer), and Valerie Verburg (Technical Author) for their valuable input.

About the Author:

BrianGajadhar

Brian Gajadhar PhD MSc
Etteplan | Tedopres
b.gajadhar@tedopres.com

Brian Gajadhar is the Research & Development program manager of Etteplan | Tedopres, a leader in technical information services. Brian holds a PhD in Human Technology and Interaction from the University of Eindhoven and has a background in physics. He has been with Etteplan | Tedopres for 2 years, during which he has managed the research and development of Etteplan | Tedopres’ leading solutions HyperSTE, HyperSTI, HyperDOC, HyperCenter, HyperSIS and HyperParts. Under his supervision the Tedopres Model and the Etteplan Technical Information Framework have been developed and described.

1. Technical documentation is often written in English; from this language, it may be translated into other languages. Although some writers author directly in their native languages, the standard dictates that the leading document should be written in English.

2. Simplified English is now officially known under its trademarked name as Simplified Technical English (STE).

3. The use of manipulated versions of the ASD-STE100 specification, partial use of the specification or deviations from its writing rules and vocabulary will diminish accuracy of STE and create confusion among its users (Chiarello, 2011).

4. For participants who stated that they used a certain checker—for instance, Acrocheck—we filled in the name of that checker for “the checker” in the questionnaire.

5. STE is highly recommended by the S1000D specification.

6. The author reviewed the original text at an acceptable rate of 180 sentences per hour. All sentences were checked for STE compliance, but not all were rewritten.

7. The Comprehensibility rating was transformed from the 1–7 scale provided to 0–100% for the comparison with STE Compliance and then reversed (i.e., 1 = 100%; 7 = 0%).
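Assuming a linear mapping, this transformation takes a rating r on the 1–7 scale to 100 × (7 − r) / 6, so that 1 (very easy) becomes 100% and 7 (very hard) becomes 0%:

```python
def to_percent(rating):
    """Map a 1-7 comprehensibility rating (1 = very easy) to a
    reversed 0-100% scale for comparison with STE Compliance.
    Linearity is an assumption; the article only states the range."""
    return 100.0 * (7 - rating) / 6

print(to_percent(1), to_percent(4), to_percent(7))  # 100.0 50.0 0.0
```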

8. The increase in compliance ratings is remarkable, since it may be assumed that S1000D example texts are written according to STE. This proves that by using a checker as an additional tool, the quality of text can be significantly improved.

References:

Acrolinx
Authoring Support
Retrieved on July 9, 2012
from: http://www.acrolinx.com/authoring-support_en.html

Michael Argyle (1988).
Bodily Communication (2nd ed.)
Madison, WI
International Universities Press
ISBN: 0416381405

ASD
Specification ASD-STE100: International specification for the preparation of maintenance documentation in a controlled language
Issue 5, 2013

Boeing
Boeing Research & Technology
Retrieved on May 7, 2012
from: http://www.boeing.com/phantom/sechecker.

Berry Braster
“Controlled Language in Technical Writing”
Multilingual
January/February 2009

Orlando Chiarello
Service information letter/Position paper: To all users of ASD Simplified technical English, ASD-STE100
Retrieved on June 29, 2012

Congree
Linguistics
Retrieved on July 9, 2012
from: http://www.congree.com/en/congree-linguistics.aspx

Larry Constantine, Robert Biddle and James Noble
Usage-centred design and software engineering: Models for integration. Bridging the gaps between software engineering and human-computer interaction
ICSE’03 International Conference on Software Engineering
Portland, Oregon, page 106-113

Dekker, J. & Wijma, F.
The new language in International Business: Simplified Technical English 2nd edition. Tedopres International B.V., Tilburg
2004

Disborg, K. (2007)
Advantages with Simplified Technical English—to be used in technical documentation by Swedish export companies
Master thesis: Linkoping University, department of Computer and Information Science, Sweden.

Joseph Dumas and Janice Redish
A Practical Guide to Usability Testing, Revised Edition
1999, Exeter, UK
Intellect Ltd
ISBN: 1841500208

Van der Eijk, P
Controlled Languages in Technical Documentation
In Elsnews: The Newsletter of the European Network Language and Speech
1998

Dave Fonseca
How Simple is Simplified Technical English
STC Intercom, 53 (2), 20-22, 2006
Retrieved on July 1, 2012
from: http://ocstc.org/attachments/048_20062_20-22.pdf

Brian Gajadhar
Efficiently Creating High-quality Technical Publications: The Tedopres Model
Whitepaper
2012

David Gefen and Detmar Straub
Gender Differences in the Perception and Use of E-mail: An Extension to the Technology Acceptance Model
MIS Quarterly
May, 1997

Haslam, P. & Schaefer, U.
The S1000D Bike Example
Retrieved from FrameMaker 10, Adobe, 2012

Mark Knapp and John Daly
Handbook of Interpersonal Communication
2002, Thousand Oaks, CA
Sage
ISBN: 0761921605

Arnold Lund
Measuring Usability with the USE Questionnaire
STC Usability SIG Newsletter
February, 2001

Albert Mehrabian and Susan Ferris
Inference of Attitudes from Nonverbal Communication in Two Channels
Journal of Consulting Psychology
June, 1967

Microsoft
Choose How Spell Check and Grammar Check Work
Retrieved on July 9, 2012
from: http://office.microsoft.com/en-us/powerpoint-help/choose-how-spell-check-andgrammar-check-work-HP010354280.aspx

Eric Nyberg, Teruko Mitamura, and Willem-Olaf Huijsen (2003)
Controlled Language for Authoring and Translation
H. Somers (ed.), Computers and Translation: A translator’s guide
John Benjamins Publishing Company, Amsterdam

Chung-ling Shih
Shift in Controlled English Norms for Different Purposes and for Different Machine Translation Systems
Journal of Language & Translation
September, 2010

John Short, Ederyn Williams, and Bruce Christie (1976).
The Social Psychology of Telecommunications
1976, New York
John Wiley & Sons
ISBN: 0471015814

Serena Shubert, Jan Spyridakis, Heather Holmback, and Mary Coney
Testing the Comprehensibility of Simplified English: An Analysis of Airplane Procedure Documents
IPCC’ 95 Proceedings; Smooth Sailing in the Future

Jan Spyridakis, Heather Holmback, and Serena Shubert
Measuring the Translatability of Simplified English in Procedural Documents.
IEEE Transactions on Professional Communication
March, 1997

Tedopres HyperSTE 4.1.
Retrieved on July 2, 2012
from: http://www.simplifiedenglish.net/

Jansen, C. & Balijon, S.
How Do People Use Instruction Guides? Confirming and Disconfirming Patterns of Use
Document Design, 3, 195-204
2002

 
