February 12, 2020
Products are sold into markets that cross both country and language boundaries, so product documentation must be localized. Localizing documentation is made easier with DITA, a capable Component Content Management System (CCMS), and a Translation Management System (TMS). Even with these systems in place, you must always consider best practices and pitfalls. Jim will begin with an overview of the localization process and of how content flows across systems and organizations. In this context, we will discuss techniques, as well as wins and pitfalls, in automating and streamlining the localization process. In this session, attendees will learn:
- About the localization process: what the various roles are and how content flows
- About use cases and constraints: what incremental localization is and how localization relates to the product release lifecycle
- About release management and localization: what role branching can play in localization
- About localization tools: how CCMS, TMS, and LSP relate in the localization process
- About recent developments in automation: how the localization process can be accelerated through automation, and what the wins and pitfalls of automation are
Recorded on June 15, 2022
The Future of Documents is shaped by a fundamental change in how information is exchanged. This future sees a shift from traditional 'e-paper', formatted and optimized for reading by humans, towards semantically tagged information exchanged in an open digital format. In this talk, we will share examples from industries that have a history in structured content, such as publishing and technical documentation, as well as from industries that are 'newer to the game', such as pharma and financial services.
- The future of documents: structured content and semantics from niche to mainstream
- Define Structured Content. What is it, and why is it so fundamentally different from formatted documents?
- What drives organizations to move from traditional authoring towards structured content authoring?
- How can industries learn from each other? A discussion of benefits and features.
- Content automation
- Documents as data
- Component based authoring
- Future of documents
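To make the contrast with formatted documents concrete: in structured content, markup records what a piece of information is, not how it looks. The following minimal DITA task topic is a generic, hypothetical example (not taken from the webinar) of semantically tagged content:

```xml
<task id="replace-filter">
  <title>Replace the air filter</title>
  <taskbody>
    <!-- "prereq" and "cmd" name the role of the text, not its formatting -->
    <prereq>Switch off the unit and unplug it.</prereq>
    <steps>
      <step><cmd>Open the service panel.</cmd></step>
      <step><cmd>Slide out the used filter and insert the new one.</cmd></step>
    </steps>
  </taskbody>
</task>
```

Because every element states the content's meaning (a prerequisite, a command), rendering decisions such as fonts and layout can be deferred to the publishing pipeline, and the same component can be reused, translated, and delivered to any channel.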
September 9, 2020 This presentation sits at the intersection of content and design. It is a high-level practical guide to analyzing your content, deciding what you want to do with it, developing style naming conventions, and developing editorial style guides, design style guides, and style templates. Once you reach that stage, you can consider how content management systems, automation, and XML might fit into your publishing processes. Presented by: Marie Gollentz is a Senior Solutions Consultant focusing on the European market. Prior to joining Typefi, she held a number of positions in the publishing industry in London, including at the publisher of Research Fortnight and at the London School of Business and Finance. Marie holds a Master's degree in European Political Sciences from the Autonomous University of Barcelona and a Bachelor's degree in Political Sciences from Sciences Po Strasbourg. She is trilingual in English, French, and Spanish.
Date: June 12, 2019 Abstract: Ever since DITA was first used, the greatest obstacle to adoption has been usability. Whilst technical authors have been prepared to learn the intricacies of DITA, others in the content creation lifecycle have been more reluctant. Bluestream, together with Simply XML, will show how the extensible XDocs CCMS can be integrated with clear browser and desktop authoring environments to allow subject matter experts to create and collaborate on content without ever having to see a tag.
Presented by: Rik Page is Sales and Marketing Director at Bluestream Software and has been working with both component content and document management solutions since 2001. During this time he has worked with custom DTDs and Schemas, S1000D, iSpec2200, and DITA. His experience covers a wide range of industries, including education, banking and finance, manufacturing, and healthcare. A keen advocate of technology and innovation, Rik has taken part in multiple consultancy projects and helped formulate solutions all over the world.
Over more than two decades, Doug Gorman commercialized the structured writing methodology known as Information Mapping. In the process he realized that XML (DITA) could be used beyond technical publication departments to modernize the enterprise content supply chain. The authoring and repository tools, however, would need to be easy to use: XML would need to be utilized but hidden "under the covers." Simply XML's authoring tool, Content Mapper, is a Word plug-in where the author sees a Word user interface while the CMS uses DITA XML. Content Mapper is designed for organizations that understand the value of XML as the content architecture but whose authors are among the billion-plus non-technical users of MS Word on a PC. Content Mapper is integrated with Bluestream's XDocs CCMS to help organizations achieve better content for readers, with the important efficiencies of content reuse, single-source publishing, and other process improvements.
Recorded: June 9, 2022
Terminology only works effectively if the organization manages to agree on consistent concepts and terms. But to achieve company-wide understanding and agreement, a well-thought-out process and regular interaction are needed. While some organizations prefer a systematic process, others react to ad hoc requests, and some are happy with a combination of both. Therefore, we
- look at the advantages and disadvantages of both approaches and
- share how to set them up in practice
Recorded: April 27, 2022
Presented by Klaus Fleischmann
OK, so we have all understood that terminology is important for content, AI, search engines, consistent naming, etc. But what do you need to do
- to launch a professional and scalable terminology process,
- to convince your boss and your peers that your company needs it and
- to get this off the ground quickly and efficiently with the help of modern terminology software?
Recorded on September 8, 2022
Does this sound familiar? There is so much data in your organization, and it is not always clear which sources are up to date and truly relevant to business decisions. Terminology can act as the single source of truth, breaking up data silos and providing the same information to every employee. But in order to achieve that, terminology needs to get close to the users and find its way into their systems.
We share how
- to implement the "single source of truth” in terminology work
- to connect your systems, e.g., ERP, CMS, PIM, CAT tools, etc.,
- to distribute the information
- to make this information usable for all
Join us for a tour that starts with your relevant data sources and ends with a clear and concise terminology process as a way of making sense and use of this data.
Presented by Christian Lang
Christian Lang, Technical Consultant, has a wide range of experience and interests in the language field, proven by his degree in Japanese studies and translation. He first became involved with terminology management as a freelance translator for the European Patent Office. Since then, it has become one of his hobbies, as has research in the field of NLP on topics such as machine translation, automatic term extraction, and concept maps.
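Connecting terminology to systems such as ERP, CMS, PIM, and CAT tools usually relies on a vendor-neutral interchange format. One widely used option is TBX (TermBase eXchange, ISO 30042); the entry below is an illustrative sketch, not material from the webinar, showing how one concept carries a definition, an approved English term, and its German equivalent:

```xml
<martif type="TBX" xml:lang="en">
  <text>
    <body>
      <!-- One concept entry shared by every connected system -->
      <termEntry id="c42">
        <descrip type="definition">Removable cartridge that traps dust.</descrip>
        <langSet xml:lang="en">
          <tig>
            <term>air filter</term>
            <termNote type="administrativeStatus">preferredTerm-admn-sts</termNote>
          </tig>
        </langSet>
        <langSet xml:lang="de">
          <tig><term>Luftfilter</term></tig>
        </langSet>
      </termEntry>
    </body>
  </text>
</martif>
```

Because every consumer imports the same entries from one termbase export, each system presents the same approved terms, which is the "single source of truth" idea in practice.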
March 6, 2019 In a world where Google sets the gold standard for providing relevant search results, how can technical documentation teams create findability that is always spot-on for every single reader? Relevance of search is determined by a combination of content metadata, contextual knowledge about the user, and the search query itself. The challenge therefore resides in collecting and analyzing these elements, and applying them systematically to every search query to create truly personalized search results. Presented by: Fabrice Lacroix is a well-known Web pioneer and the founder of Antidot, the company that puts enterprise content to work. As an entrepreneur, he has been working for 25 years on the development of the Internet and of the Web through several major companies.
February 15, 2022 Just putting your technical documentation on your webpage as a PDF for download will not make you a content hero. PDF is a given, a fundamental requirement. However, both business leaders and end customers expect more than just a documentation portal or a well-designed PDF. They want contextually relevant, personalized, consistent, conversational, and scalable content experiences. But how can you deliver on this expectation? In this session, Stefan Gentz, Senior Worldwide Evangelist for Technical Communication at Adobe, will share insights from the recent Adobe-commissioned Forrester study. Forrester surveyed 450 decision-makers on Content Experience Management to understand how global brands deliver relevant and contextual experiences across touchpoints. He will explore the design of positive customer journeys from marketing to technical support and self-service and back to sales: experiences that provide a 360° content experience for your customers and guide them across their whole content journey. Presented by: Stefan's mission is to inspire enterprises and technical writers around the world and show how to create compelling technical communication content with the Adobe Technical Communication tools. He is also a certified Quality Management Professional (TÜV), ISO 9001 / EN 15038 auditor, ISO 31000 Risk Management expert, and Six Sigma Champion. As a sought-after keynote speaker and moderator at conferences around the world, he travels around the globe half of the year. Besides that, he has been the European Ambassador for the Globalization and Localization Association (GALA) for many years, a member of the tekom Conference Advisory Board for several years, and is now a member of the tekom iiRDS working group for Intelligent Information and a member of the OASIS DITA Adoption Committee.
In 2016, Stefan Gentz was recognized by MindTouch as one of the Top 25 Leading Content Strategist Influencers in the world, and in 2017 as one of the Top 25 Content Experience Influencers in the world. Stefan Gentz on LinkedIn | Stefan Gentz on XING | @stefangentz on Twitter | www.adobe.com
November 11, 2021 Content authors need to consider many factors when converting their existing content to structured content, including analyzing the content at its source. The exercise is interesting yet challenging, vexing yet rewarding. But what are the factors you need to think about for this conversion? For starters, you need to consider:
- Character styles and how they are assigned (Use the style named Emphasis? Press Ctrl+I? Use a toolbar icon? Is there a difference?)
- Paragraph styles and how they are assigned (Did you skip bullet 1 and go right to bullet 2 because it looks better and, if you did, what does it mean?)
- Images and how to import them (Did you copy/paste or use the import setting? If so, how did you import it? As a link to the source?)
- Links between files or to the web (Was the URL just typed in, or was the link inserted using a linking tool? How will it convert?)
- Page layout, tables, design, variables, equations, and so on
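The character-style question above is visible in the file format itself. In WordprocessingML (the XML inside a .docx file), the same italic word can be recorded in two different ways, and only one of them carries a name that a conversion pipeline can map to a semantic element. A simplified, hypothetical sketch of the two run markups:

```xml
<!-- Direct formatting: the author pressed Ctrl+I or clicked the toolbar icon.
     The run only says "italic"; a converter learns nothing about intent. -->
<w:r>
  <w:rPr><w:i/></w:rPr>
  <w:t>metadata</w:t>
</w:r>

<!-- Named character style: the author applied the "Emphasis" style.
     The style name can be mapped to a semantic element during conversion. -->
<w:r>
  <w:rPr><w:rStyle w:val="Emphasis"/></w:rPr>
  <w:t>metadata</w:t>
</w:r>
```

This is why a conversion analysis has to ask not just what the content looks like, but how each look was produced: identically rendered text can convert very differently.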
Recorded on November 3, 2022
For over fifteen years, teams have tried to justify their need for a move to structured content with the same old arguments. In today's ever-changing landscape, these old arguments are no longer sufficient and, in many cases, not even relevant. In this webinar, we will be looking at other drivers for the move to structure with real-world examples of what can be achieved. Presented by Rik Page, Bluestream Rik has worked with component content and document management solutions for over 20 years. He has worked with custom DTDs and Schemas, S1000D, iSpec2200, and DITA, together with various document and content management systems, including Documentum, SharePoint, XDocs, Ixiasoft, easyDITA, Astoria, and Vasont. His practical experience ranges from high-volume data capture and content creation to dynamic multichannel delivery in various industries, including banking and finance, manufacturing, central government, and education. In addition, Rik has participated in multiple consultancy projects and helped formulate innovative solutions throughout Europe and North America. Rik is also an advocate of Documentation 4.0, a new concept reflecting the demands on content/documentation that result from moving to Industry 4.0 and 'smart' manufacturing.
Recorded: April 13, 2022
Presented by Michael Klemme and Klaus Fleischmann
Terminology is at the core of great content. When used consistently, it helps you communicate precisely and efficiently, which strengthens your brand communication. It also makes your content accurate.
But it’s important to have a sound process in place to manage your terminology. Join us and learn how to:
- Make sure all stakeholders have input into your terminology set
- Set a process for consent and approval of your terms
- Make sure you have a consistent and failsafe process to:
  - have all your terms in Acrolinx for checking, as the content quality solution of choice
  - keep all terminology action concentrated in quickTerm, as the leading terminology system
Michael Klemme is a Senior Solutions Architect at Acrolinx. He advises new and existing customers on how they can efficiently integrate Acrolinx into their processes and helps partners develop integrations.
Klaus Fleischmann studied translation and IT in Vienna, holds an MA in Conference Interpreting from Monterey, California, and an MAS in Technical Communication from Krems, Austria. In 1996, he founded Austria-based Kaleidoscope, a company implementing content, translation, and terminology management processes for internationally active companies. Kaleidoscope develops online collaboration software for enterprise-level terminology workflow, translator query management, in-country review, etc., making the translation quality process comprehensible and strategically manageable. In 2007, he became CEO of Austria's leading LSP, Eurocom Translation Services GmbH. Always active in the industry, Klaus was voted onto the GALA Board of Directors in 2015 and 2017.