Technology Readiness Assessment Guide


Technology Readiness Assessment Guide
Best Practices for Evaluating the Readiness of Technology for Use in Acquisition Programs and Projects
GAO-16-410G
August 2016

From August 11, 2016 to August 10, 2017, GAO is seeking input and feedback on this Exposure Draft from all interested parties. See page 9 for more information.

Contents

Preface  8
Introduction  11
The Guide's Case Studies  13
The Guide's Readers  13
Acknowledgments  14

Chapter 1  15
What Is a Technology Readiness Assessment (TRA)?  15
Definitions and Overview  16
TRAs Inform Technology Development and Identify Potential Concerns  18
Overview of Technology Development and Its Relationship to Acquisition Programs  18
Technology Development Models  23
TRA's Relationship to Program Management and Oversight  24
Tailoring TRAs for Different Purposes  25

Chapter 2  26
Why TRAs Are Important and Understanding Their Limitations  26
Maturity of Technology at Program Start Is an Important Determinant of Success  28
TRA Process Is a Mechanism That Informs Important Acquisition Functions  29
Understanding TRAs Can Help Practitioners and Consumers of Information  31
TRAs Are Snapshots in Time  33
Advancements in Technology Can Pose Challenges in Applying TRAs  33
Organizational Experience, Culture, and Bias Can Affect TRAs  34
TRAs Depend on the Quality and Availability of Credible Data  36

Chapter 3  38
Best Practice: A Reliable Process for Conducting Credible TRAs  38
More Frequent Evaluations of Technology Maturity  40
High Quality TRAs  41

Chapter 4  43
Best Practice: Including Technology Maturity Assessments in the Program Strategy, Designing the TRA Plan, and Determining the Team  43
Technical Maturity Assessment Strategy  43
The Assessment Team  46
The TRA Plan  48

Chapter 5  52
Best Practice: Selecting Critical Technologies  52
Critical Technologies Defined  52
Challenges in Selecting Critical Technologies  54
Steps for Selecting Critical Technologies  55
Identifying Other Important Technologies and Programmatic Issues  65

Chapter 6  68
Best Practice: Evaluating Critical Technologies  68
Steps for Evaluating Critical Technologies  68
Relevant Information Must Be Used to Evaluate Critical Technologies  74
Operational Environment Is Key to Evaluating Critical Technologies  74
Creating Critical Technology Subsets for Exceptionally Large or Complex Programs  75

Chapter 7  77
Best Practice: Preparing the TRA Report  77
The TRA Report  77
Steps for Preparing and Coordinating the TRA Report  80
Response to the TRA Report  84
How Dissenting Views Are Documented and Submitted  84

Chapter 8  86
Best Practice: Using the TRA Results  86
How TRA Reports Are Used  86
TRAs for Governance Decisions  87
TRAs as Knowledge-building Exercises  88
Identification of Potential Areas of Concern and Risk  89
Early Technology Development  91
TRA Process Facilitates Information Sharing Opportunities  92
Basis for Developing a Technology Maturation Plan (TMP) for Immature Technologies  93

Chapter 9  95
Best Practice: Preparing a Technology Maturation Plan (TMP)  95
Steps for Preparing a Technology Maturation Plan  96
Updating a Technology Maturation Plan  98
The Technology Maturation Plan Template  101

Chapter 10  106
Practices Are Evolving in Evaluating Software Systems and Systems Integration Using TRAs  106
Applying TRAs to Software Systems  106
Software Embedded Technologies versus Software-only Technologies  108
Development of System-level Readiness Metrics  109

Appendix I: Key Questions to Assess How Well Programs Followed the Six Step Process for Developing Credible TRAs  112
Appendix II: Auditing Agencies and Their Websites  117
Appendix III: Case Study Backgrounds  119
Case Study 1: Immature Technologies Increase Risk, GAO-08-408  119
Case Study 2: Assessments Provide Key Information, GAO-10-675  120
Case Study 3: Space Programs Often Underestimate Costs, GAO-07-96  120
Case Study 4: Program Updates Can Change Critical Technologies, GAO-02-201  121
Case Study 5: Identifying Back-up Critical Technologies, GAO-08-467SP  121
Appendix IV: Experts Who Helped Develop This Guide  123
Appendix V: Contacts and Acknowledgments  130
GAO Contacts  130
Other Leadership Provided for This Project  130
Acknowledgments  130
Appendix VI: Examples of Various TRL Definitions and Descriptions by Organization  131
Appendix VII: Other Types of Readiness Levels  137
Appendix VIII: Agency Websites Where TRA Report Examples Can Be Found  142
References  143
Image Sources  146

Tables

Table 1: Cost and Schedule Experiences for Products with Mature and Immature Technologies  29
Table 2: Six Steps for Conducting a Technology Readiness Assessment (TRA)  39
Table 3: Characteristics of High Quality Technology Readiness Assessments (TRA)  41
Table 4: Software Implementation Project Work Breakdown Structure  60
Table 5: Technology Readiness Levels (TRLs) Supporting Information for Hardware and Software  72
Table 6: Example Program Management Tools Used with Technology Readiness Assessments (TRAs)  90
Table 7: Auditing Agency Websites  117
Table 8: GAO Reports Used As Case Study in the TRA Guide  119
Table 9: Experts Who Made Significant Contributions  123
Table 10: Experts Who Made Noteworthy Contributions  124
Table 11: DOD Technology Readiness Levels (2011)  131
Table 12: DOD Software Technology Readiness Levels (2009)  132
Table 13: NASA Hardware Technology Readiness Levels (2013)  133
Table 14: NASA Software Technology Readiness Levels (2013)  134
Table 15: DOE Technology Readiness Levels (2011)  135
Table 16: DOD Manufacturing Readiness Levels  137
Table 17: Integration Readiness Levels  140
Table 18: System Readiness Levels  141

Figures

Figure 1: Technology Readiness Levels  17
Figure 2: Phased Acquisition Cycle with Decision Points  19
Figure 3: Technology Readiness Assessment and Technology Readiness Level Limitations  32
Figure 4: Notional Depiction of the Integrated Schedule for a Program  46
Figure 5: Four Steps for Selecting Critical Technologies  56
Figure 6: Common Elements of a Work Breakdown Structure  57
Figure 7: A Contract Work Breakdown Structure  59
Figure 8: A Process Flow Diagram (simplified)  62
Figure 9: Four Steps to Evaluate Critical Technologies  69
Figure 10: Technology Readiness Assessment Report Template  79
Figure 11: Five Steps to Prepare the Technology Readiness Assessment Report  81
Figure 12: Acquisition Cycle with Technology Readiness Assessments at Decision Points for Governance  88
Figure 13: Five Steps to Prepare the Technology Maturation Plan  96

Best Practices Checklists

Best Practice Checklist: TRA Team and Purpose, Scope, and Schedule for TRA Plan  50
Best Practice Checklist: Selecting Critical Technologies  66
Best Practice Checklist: Evaluating Critical Technologies  76
Best Practice Checklist: Preparing the TRA Report  85
Best Practice Checklist: Using the TRA Results  93
Best Practice Checklist: Preparing a TMP for Immature Technologies  104
Best Practice Checklist: Evaluating Software Systems Using TRAs  111

Abbreviations

AD2  Advancement Degree of Difficulty
COTS  commercial off the shelf
DDR&E  Deputy Director for Research and Engineering
DOD  U.S. Department of Defense

DOE  U.S. Department of Energy
DTRA  Defense Threat Reduction Agency
IRL  integration readiness level
ISRA COI  International Systems Readiness Assessment Community of Interest
IT  information technology
MAIS  major automated information system
MDA  Milestone Decision Authority
MRL  Manufacturing Readiness Level
NASA  National Aeronautics and Space Administration
NPOESS  National Polar-orbiting Operational Environmental Satellite System
R&D3  research and development degree of difficulty
RI3  risk identification, integration, and ilities
SEMP  systems engineering master plan
SRL  system readiness level
TBD  technical baseline description
TMP  technology maturation plan
TE  technology element
TPMM  technology program management model
TRA  technology readiness assessment
TRL  technology readiness level
USASMDC  U.S. Army Space and Missile Defense Command
WBS  work breakdown structure

This Guide is a work of the U.S. government and is not subject to copyright protection in the United States. The published product may be reproduced and distributed in its entirety without further permission from GAO. However, because this work may contain copyrighted images or other material, permission from the copyright holder may be necessary if you wish to reproduce this material separately.

Preface

The U.S. Government Accountability Office is responsible for, among other things, assisting the Congress in its oversight of the federal government, including agency acquisition programs and projects. Federal agencies spend billions of dollars each year to develop, acquire, and build major systems, facilities, and equipment, including fighter aircraft, nuclear waste treatment facilities, electronic baggage screening equipment, and telescopes for exploring the universe. Managing these complex acquisitions has been a long-standing challenge for federal agencies.

Many of the government's most costly and complex acquisition efforts require the development of cutting-edge technologies and their integration into large and complex systems. Such acquisition efforts may also use existing technologies, but in new applications or environments. At issue is not whether to take risks, but rather where and how to take them so they can be managed more effectively. For more than a decade, GAO has shown that using effective management practices and processes to assess how far a technology has matured and how it has been demonstrated are keys to evaluating its readiness to be integrated into a system and managed for risk in the federal government's major acquisitions.

A technology readiness assessment (TRA) is a systematic, evidence-based process that evaluates the maturity of hardware and software technologies critical to the performance of a larger system or the fulfillment of the key objectives of an acquisition program. TRAs, which measure the technical maturity of a technology or system at a specific point in time, do not eliminate technology risk, but when done well, can illuminate concerns and serve as the basis for realistic discussions on how to mitigate potential risks as programs move from the early stages of technology development, where resource requirements are relatively modest, to system development and beyond, where resource requirements are often substantial.
In addition, TRAs help legislators, government officials, and the public hold government program managers accountable for achieving their technology performance goals.

This TRA guide (the Guide) is a companion to GAO's Cost Estimating and Assessment Guide and its Schedule Assessment Guide.[1] With this Guide, GAO intends to establish a methodology based on best practices that can be used across the federal government for evaluating technology maturity, particularly as it relates to determining a program or project's readiness to move past key decision points that typically coincide with major commitments of resources. Similar assessments can be made by technologists and program managers as knowledge-building exercises during the course of a project to help them evaluate technology maturity, gauge progress, and identify and manage risk.

Existing TRA guidance in government agencies and industry may include similar strategies for evaluating technology maturity, but no widely held or accepted process exists for doing so. The science and technology, systems engineering, and program management communities each view technology readiness through their own lenses, which can make for variable and subjective TRA results. In addition, some agencies have deemphasized the use of TRAs or questioned their value. We hope that this Guide can help reinvigorate the use of TRAs in those organizations.

The Guide is intended to provide TRA practitioners, program and technology managers, and governance bodies throughout the federal government a framework for better understanding technology maturity, conducting credible technology readiness assessments, and developing plans for technology maturation efforts. Organizations that have developed their own guidance can use the Guide to support and supplement their practices. Organizations that have not yet developed their own policies can use it to begin establishing their own guidance. As a companion to GAO's cost and schedule assessment guides, this Guide can also help GAO and other oversight organizations evaluate agencies' basis for their conclusions and decisions about technology readiness. We intend to keep the Guide current.

[1] GAO, Cost Estimating and Assessment Guide: Best Practices for Developing and Managing Capital Program Costs, GAO-09-3SP (Washington, D.C.: March 2009), and Schedule Assessment Guide: Best Practices for Project Schedules, GAO-16-89G (Washington, D.C.: December 2015).
We welcome comments and suggestions from experienced practitioners, as well as recommendations from experts in the science and technology community, systems engineering, and program acquisition disciplines.

If you have any questions concerning the Guide, you may contact Dr. Timothy Persons at (202) 512-3000 or Mike Sullivan at (202) 512-4841. Contact points for GAO's Office of Congressional Relations and Office of Public Affairs may be found on the last page of this Guide.

Timothy M. Persons, Ph.D.
Chief Scientist and Director
Center for Science, Technology, and Engineering
Applied Research and Methods

Michael J. Sullivan
Director
Acquisition and Sourcing Management

Introduction

Technology readiness assessments (TRA)—evaluations used to determine a technology's maturity—have been used widely at the U.S. Department of Defense (DOD) and National Aeronautics and Space Administration (NASA) since the early 2000s. Other government agencies, as well as industries in aerospace, maritime, oil and gas, electronics, and heavy equipment, have also used TRAs to help manage their acquisitions. Few agencies have guides for assessing a technology's maturity and its readiness for integration into larger acquisition programs, and the federal government has not adopted a generally accepted approach for evaluating technology beyond using technology readiness level (TRL) measures.[2] This TRA Guide (referred to as the Guide) is intended to help fill those gaps.

The Guide has two purposes: (1) describe generally accepted best practices for conducting effective evaluations of technology developed for systems or acquisition programs, and (2) provide program managers, technology developers, and governance bodies with the tools they need to more effectively mature technology, determine its readiness, and manage and mitigate risk.[3] In addition, oversight bodies—such as those with department or agency acquisition officials or government auditors—may use the Guide to evaluate whether the fundamental principles and practices of effective TRAs are followed, along with the credibility, objectivity, reliability, and usefulness of those assessments.

The Guide recognizes that TRAs have different customers within an organization, such as the governance body charged with program oversight in managing and allocating fiscal resources as the gatekeeper, as well as a narrower audience, such as the program manager, technology developer, or independent consultant that uses them to determine progress in achieving technical maturity. The Guide discusses TRAs in the context of the full range of best practices to be used for go

