
Technical Report
CMU/SEI-92-TR-019
ESC-TR-92-019

Software Measurement for DoD Systems: Recommendations for Initial Core Measures

Anita D. Carleton
Robert E. Park
Wolfhart B. Goethert
William A. Florac
Elizabeth K. Bailey
Shari Lawrence Pfleeger

Technical Report
CMU/SEI-92-TR-019
ESC-TR-92-019
September 1992

Software Measurement for DoD Systems: Recommendations for Initial Core Measures

Anita D. Carleton
Robert E. Park
Wolfhart B. Goethert
William A. Florac
Software Process Measurement Project

Elizabeth K. Bailey
Institute for Defense Analyses

Shari Lawrence Pfleeger
The MITRE Corporation

Unlimited distribution subject to the copyright.

Software Engineering Institute
Carnegie Mellon University
Pittsburgh, Pennsylvania 15213

This report was prepared for the

SEI Joint Program Office
HQ ESC/AXS
5 Eglin Street
Hanscom AFB, MA 01731-2116

The ideas and findings in this report should not be construed as an official DoD position. It is published in the interest of scientific and technical information exchange.

FOR THE COMMANDER

(signature on file)

Thomas R. Miller, Lt Col, USAF
SEI Joint Program Office

This work is sponsored by the U.S. Department of Defense.

Copyright 1996 by Carnegie Mellon University.

Permission to reproduce this document and to prepare derivative works from this document for internal use is granted, provided the copyright and “No Warranty” statements are included with all reproductions and derivative works.

Requests for permission to reproduce this document or to prepare derivative works of this document for external and commercial use should be addressed to the SEI Licensing Agent.

NO WARRANTY

THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN “AS-IS” BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.

This work was created in the performance of Federal Government Contract Number F19628-95-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. The Government of the United States has a royalty-free government-purpose license to use, duplicate, or disclose the work, in whole or in part and in any manner, and to have or permit others to do so, for government purposes pursuant to the copyright license under the clause at 52.227-7013.

This document is available through Research Access, Inc., 800 Vinial Street, Pittsburgh, PA 15212. Phone: 1-800-685-6510. FAX: (412) 321-2994. RAI also maintains a World Wide Web home page. The URL is http://www.rai.com

Copies of this document are available through the National Technical Information Service (NTIS). For information on ordering, please contact NTIS directly: National Technical Information Service, U.S. Department of Commerce, Springfield, VA 22161. Phone: (703) 487-4600.

This document is also available through the Defense Technical Information Center (DTIC). DTIC provides access to and transfer of scientific and technical information for DoD personnel, DoD contractors and potential contractors, and other U.S. Government agency personnel and their contractors. To obtain a copy, please contact DTIC directly: Defense Technical Information Center / 8725 John J. Kingman Road / Suite 0944 / Ft. Belvoir, VA 22060-6218. Phone: (703) 767-8222 or 1-800-225-3842.

Use of any trademarks in this report is not intended in any way to infringe on the rights of the trademark holder.

Table of Contents

List of Figures  iii
Acknowledgments  v
1. Introduction  1
2. Integrating Measurement with Software Processes  5
   2.1. Defining the Measurement Process  5
   2.2. Measurement and the Capability Maturity Model  7
3. Recommendations for Specific Measures  9
   3.1. The Basic Measures  9
   3.2. Size  11
      3.2.1. Reasons for using physical source line measures  13
      3.2.2. Specific recommendations for counting physical source lines  15
   3.3. Effort  15
      3.3.1. Reasons for using staff-hour measures  17
      3.3.2. Specific recommendations for counting staff-hours  18
   3.4. Schedule  19
      3.4.1. Reasons for using calendar dates  22
      3.4.2. Specific recommendations for using calendar dates  23
   3.5. Quality  24
      3.5.1. Reasons for counting problems and defects  27
      3.5.2. Specific recommendations for counting problems and defects  28
4. Implementing the Basic Measures  29
   4.1. Initial Steps  29
   4.2. Related Actions for DoD Consideration  30
   4.3. From Definition to Action—A Concluding Note  31
References  33
Appendix A: Acronyms and Terms  37
   A.1. Acronyms  37
   A.2. Terms Used  38
Appendix B: Illustrations of Use  41
   B.1. Establishing Project Feasibility  41
   B.2. Evaluating Plans  43
   B.3. Tracking Progress  49
   B.4. Improving the Process  50
   B.5. Calibrating Cost and Reliability Models  52
Appendix C: A Proposed DoD Software Measurement Strategy  53


List of Figures

Figure 1-1  Convergence Between DoD and SEI Objectives  2
Figure 1-2  Proposed SWAP Software Measurement Strategy—Principal Ingredients  2
Figure 1-3  Relationships Between This Report and Its Supporting Documents  3
Figure 2-1  Steps for Establishing a Software Measurement Process Within an Organization  5
Figure 2-2  Stages of a Measurement Process  6
Figure 2-3  Relationship of Software Measures to Process Maturity  8
Figure 3-1  Measures Recommended for Initial DoD Implementation  9
Figure 3-2  A Part of the Recommended Definition for Physical Source Lines of Code  11
Figure 3-3  Specifying Data for Project Tracking (A Partial Example)  12
Figure 3-4  The Case of Disappearing Reuse  13
Figure 3-5  Sections of the Recommended Definition for Staff-Hour Reports  16
Figure 3-6  Sections of the Schedule Checklist for Milestones, Reviews, and Audits  19
Figure 3-7  Sections of the Schedule Checklist for CSCI-Level Products  20
Figure 3-8  Example of a Report Form for System-Level Milestone Dates  21
Figure 3-9  A Portion of the Definition Checklist for Counting Problems and Defects  25
Figure 3-10  A Portion of the Checklist for Defining Status Criteria  26
Figure 3-11  A Portion of the Checklist for Requesting Counts of Problems and Defects  27
Figure B-1  Illustration of Effects of Schedule Acceleration  42
Figure B-2  Indications of Premature Staffing  43
Figure B-3  A More Typical Staffing Profile  44
Figure B-4  Exposing Potential Cost Growth—The Disappearance of Reused Code  44
Figure B-5  Project Tracking—The Deviations May Seem Manageable  45
Figure B-6  Project Tracking—Deviations from Original Plan Indicate Serious Problems  46
Figure B-7  Project Tracking—Comparisons of Developer’s Plans Can Give Early Warnings of Problems  46
Figure B-8  Comparison of Compressed and Normal Schedules  47

Figure B-9  Continually Slipping Milestones  48
Figure B-10  Effects of Slipping Intermediate Milestones  48
Figure B-11  Extrapolating Measurements to Forecast a Completion Date  49
Figure B-12  Effects of Normal Schedules  50
Figure B-13  Effects of Detecting Defects Early  51
Figure C-1  Context for Initial Core Measures  53

Acknowledgments

Since 1989, the SEI has been assisted in its software measurement initiative by a Measurement Steering Committee that consists of senior representatives from industry, government, and academia. The people on this committee have earned solid national and international reputations for contributions to measurement and software management. They have helped us guide the efforts of our working groups so that we could integrate their work with not only this report, but also our other software measurement activities. We thank the members of the committee for their many thoughtful contributions. The insight and advice these professionals have provided have been invaluable:

William Agresti, The MITRE Corporation
Henry Block, University of Pittsburgh
David Card, Computer Sciences Corporation
Andrew Chruscicki, US Air Force Rome Laboratory
Samuel Conte, Purdue University
Bill Curtis, Software Engineering Institute
Joseph Dean, Tecolote Research
Stewart Fenick, US Army Communications-Electronics Command
Charles Fuller, Air Force Materiel Command
Robert Grady, Hewlett-Packard
John Harding, Bull HN Information Systems, Inc.
Frank McGarry, NASA (Goddard Space Flight Center)
John McGarry, Naval Underwater Systems Center
Watts Humphrey, Software Engineering Institute
Richard Mitchell, Naval Air Development Center
John Musa, AT&T Bell Laboratories
Alfred Peschel, TRW
Marshall Potter, Department of the Navy
Samuel Redwine, Software Productivity Consortium
Kyle Rone, IBM Corporation
Norman Schneidewind, Naval Postgraduate School
Herman Schultz, The MITRE Corporation
Seward (Ed) Smith, IBM Corporation
Robert Sulgrove, NCR Corporation
Ray Wolverton, Hughes Aircraft

As we prepared this report, we were aided in our activities by the able and professional support staff of the SEI. Special thanks are owed to Linda Pesante and Mary Zoys, whose editorial assistance helped guide us to a final, publishable form; to Lori Race, who coordinated our meeting activities and provided outstanding secretarial services; and to Helen Joyce and her assistants, who so competently assured that meeting rooms, lodgings, and refreshments were there when we needed them.

And finally, this report could not have been assembled without the active participation and many contributions from the other members of the SEI Software Process Measurement Project and the SWAP measurement team who helped us shape these materials into forms that could be used to support the DoD Software Action Plan:

John Baumert, Computer Sciences Corporation
Mary Busby, IBM Corporation
Andrew Chruscicki, US Air Force Rome Laboratory
Judith Clapp, The MITRE Corporation
Donald McAndrews, Software Engineering Institute
James Rozum, Software Engineering Institute
Timothy Shimeall, Naval Postgraduate School
Patricia Van Verth, Canisius College

Software Measurement for DoD Systems: Recommendations for Initial Core Measures

Abstract. This report presents our recommendations for a basic set of software measures that Department of Defense (DoD) organizations can use to help plan and manage the acquisition, development, and support of software systems. These recommendations are based on work that was initiated by the Software Metrics Definition Working Group and subsequently extended by the SEI to support the DoD Software Action Plan. The central theme is the use of checklists to create and record structured measurement descriptions and reporting specifications. These checklists provide a mechanism for obtaining consistent measures from project to project and for communicating unambiguous measurement results.

1. Introduction

In its 1991 Software Technology Strategy [DoD 91], the Department of Defense (DoD) set three objectives to be achieved by the software community by the year 2000:

- Reduce equivalent software life-cycle costs by a factor of two.
- Reduce software problem rates by a factor of ten.
- Achieve new levels of DoD mission capability and interoperability via software.

To achieve these objectives, the DoD needs a clear picture of software development capabilities and a quantitative basis from which to measure overall improvement. With quantitative information, national goals can be set to help keep the entire community competitive and focused on continuous improvement of products and processes. This is not possible today. Few organizations have a comprehensive, clearly defined software measurement program, and measurement is frequently done in different ways. Because there are no standard methods for measuring and reporting software products and processes, comparisons across domains or across the nation are impossible. A US company cannot know if its software quality is better or worse than the national average because no such national information is available. The meters, liters, and grams available as standards in other disciplines are missing, and there is seldom a clear understanding of how a measure on one software project can be compared or converted to a similar measure on another.

The Software Technology Strategy has now been made part of a larger DoD initiative called the Software Action Plan (SWAP). This plan establishes 17 initiatives related to developing and managing software systems. One of its initiatives is to define a core set of measures for use within DoD software projects. The Software Engineering Institute (SEI) was asked to lead this initiative because there was a natural convergence between DoD objectives and work that the SEI already had underway (Figure 1-1).

[Figure 1-1: Convergence Between DoD and SEI Objectives]

Principal Components of the Software Measurement Strategy Discussed by the SWAP Working Group

Short Title     Subject
SEI Core Set    Recommendations for Initial Core Measures
STEP            Army Software Test and Evaluation Panel—Software Metrics Initiatives
AFP 800-48      Acquisition Management—Software Management Indicators
MIL-STD-881B    Work Breakdown Structures for Defense Materiel Items
I-CASE          Integrated Computer-Aided Software Engineering
STARS           Software Technology for Adaptable, Reliable Systems
CMM             Capability Maturity Model for Software

Figure 1-2 Proposed SWAP Software Measurement Strategy—Principal Ingredients

The tasks assigned to the SEI were to prepare materials and guidelines for a set of basic measures that would help the DoD plan, monitor, and manage its internal and contracted software development projects. These materials and guidelines would provide a basis for collecting well-understood and consistent data throughout the DoD. They would also support other measurement activities the DoD is pursuing. Figure 1-2 on the facing page lists some of the principal components of the measurement strategy the SWAP Working Group has been discussing. The timelines associated with this strategy are presented in Appendix C.

The memorandum of understanding that initiated the SWAP measurement work called for the SEI to build upon existing draft reports for size, effort, schedule, and quality measurement that had been prepared by the Software Metrics Definition Working Group. These drafts were distributed for industry and government review in the fall of 1991. We have now extended that work, guided by the comments we have received; and our results are presented in three “framework” documents that are being published concurrently with this report [Park 92], [Goethert 92], [Florac 92]. These documents provide methods for clearly communicating measurement results. They include measurement definitions; checklists for constructing alternative definitions and data specifications; instructions for using the checklists to collect, record, and report measurement data; and examples of how the results can be used to improve the planning and management of software projects. It is from the framework documents that the recommendations in this report are drawn. The framework documents should be viewed as companion reports by anyone seeking to implement the recommendations presented herein. Figure 1-3 shows the interrelationships among these reports.

[Figure 1-3: Relationships Between This Report and Its Supporting Documents. The supporting documents are the software size measurement framework [Park 92], the software effort and schedule measurement framework for counting staff-hours and reporting schedule information [Goethert 92], and the software quality measurement framework for counting problems and defects [Florac 92].]
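To make the checklist idea concrete, the sketch below shows one way a recorded size definition could be represented and applied by a data collection tool. It is only an illustrative sketch in Python: the attribute names (statement_type, how_produced) and their values are assumptions loosely patterned on the checklists in the framework documents [Park 92], not the SEI's actual checklist wording or a prescribed implementation.

```python
# Illustrative sketch only. Attribute and value names are hypothetical,
# loosely patterned on the SEI size-definition checklists [Park 92].

from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class ChecklistRule:
    """One checklist entry: an attribute value and whether matching lines are counted."""
    attribute: str   # e.g., "statement_type" or "how_produced"
    value: str       # e.g., "comment", "generated"
    include: bool    # True -> count such lines, False -> exclude them


@dataclass
class SizeDefinition:
    """A recorded measurement definition assembled from checklist rules."""
    name: str
    rules: List[ChecklistRule] = field(default_factory=list)

    def counts_line(self, attrs: Dict[str, str]) -> bool:
        """A line is counted unless a rule explicitly excludes one of its attribute values."""
        return not any(
            attrs.get(rule.attribute) == rule.value and not rule.include
            for rule in self.rules
        )

    def count(self, lines: List[Dict[str, str]]) -> int:
        """Count the lines admitted by this definition."""
        return sum(1 for attrs in lines if self.counts_line(attrs))


# Example: a definition that counts non-blank, non-comment, hand-programmed lines.
physical_sloc = SizeDefinition(
    name="physical source lines",
    rules=[
        ChecklistRule("statement_type", "blank", include=False),
        ChecklistRule("statement_type", "comment", include=False),
        ChecklistRule("how_produced", "generated", include=False),
    ],
)

sample = [
    {"statement_type": "executable", "how_produced": "programmed"},
    {"statement_type": "comment", "how_produced": "programmed"},
    {"statement_type": "executable", "how_produced": "generated"},
]
print(physical_sloc.count(sample))  # prints 1
```

Because the inclusion and exclusion rules are recorded explicitly, two projects that exchange such a definition can produce size counts that are directly comparable, which is the consistency the checklists are intended to provide.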

The starting point for our measurement definition work has been management's need for answers to several key questions that are present in any software project:

- How large is the job?
- Do we have sufficient staff to meet our commitments?
- Will we deliver on schedule?
- How good is our product?
- How are we doing with respect to our plans?

To address these questions, we have concentrated on developing methods for obtaining unambiguous measures for size, effort, schedule, and quality. Reliable assessments of these characteristics are crucial to managing project commitments. Measures of these characteristics also serve as foundations for achieving improved levels of process maturity, as defined in the SEI Capability Maturity Model for Software [Humphrey 89], [Paulk 91], [Weber 91].

The objective of our measurement work is to assist program managers, project managers, and government sponsors who want to improve their software products and processes. The purpose of the recommendations in this report and its supporting framework documents is to provide operational mechanisms for getting information for three important management functions:

- Project planning—estimating costs, schedules, and defect rates.
- Project management—tracking and controlling costs, schedules, and quality.
- Process improvement—providing baseline data, tracing root causes of problems and defects, identifying changes from baseline data, and measuring trends.

A brief illustrative sketch of the project tracking function appears at the end of this chapter.

The measures we recommend in this report form a basis from which to build a comprehensive measurement and process improvement program. We support these measures with structured methods that can help organizations implement clear and consistent recording and reporting. The methods include provisions for capturing the additional details that individual organizations need for addressing issues important to local projects and processes.

A Note on Implementation Policy

Our understanding is that the DoD plans to implement the recommendations in this report. Although we expect to be assisting the DoD in this endeavor, responsibility for implementation rests with the Department of Defense. Questions with respect to implementation policy and directives should be referred to the appropriate DoD agencies.
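As the sketch promised above, the following example compares planned and actual values of the basic measures and flags large deviations, which is the essence of the project tracking function. It is a minimal illustration under assumed names and numbers: the measure names, the sample values, and the ten percent reporting threshold are placeholders, not figures or rules taken from this report.

```python
# Illustrative sketch only. Measure names, sample values, and the deviation
# threshold are hypothetical placeholders.

PLAN = {"size_sloc": 120_000, "effort_staff_hours": 18_000, "open_problem_reports": 40}
ACTUAL = {"size_sloc": 134_500, "effort_staff_hours": 21_300, "open_problem_reports": 55}


def deviations(plan: dict, actual: dict, threshold: float = 0.10) -> dict:
    """Return the relative deviation from plan for each measure exceeding the threshold."""
    flagged = {}
    for measure, planned in plan.items():
        observed = actual.get(measure)
        if observed is None or planned == 0:
            continue
        relative = (observed - planned) / planned
        if abs(relative) > threshold:
            flagged[measure] = relative
    return flagged


for measure, relative in deviations(PLAN, ACTUAL).items():
    print(f"{measure}: {relative:+.1%} versus plan")
# size_sloc: +12.1% versus plan
# effort_staff_hours: +18.3% versus plan
# open_problem_reports: +37.5% versus plan
```

Tracking such planned-versus-actual comparisons over successive reporting periods is the kind of project tracking view illustrated in Appendix B.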

2. Integrating Measurement with Software Processes

Collecting and using even the most basic measures in ways that are meaningful will prove to be a challenge for many organizations. Although some projects already collect forms of the measures we recommend, and a number of others as well, it is also true that many measurement efforts have failed because they attempted to collect too much too soon [Rifkin 91]. This chapter describes an implementation strategy that addresses both the challenge and the planning of software measurement. The strategy stresses foundations that must be laid if measurement is to be successful.

2.1. Defining the Measurement Process

Measurement definitions like those in our framework documents address but one part of a measurement program. A broader process and process infrastructure is needed to establish successful software measurement within an organization. Figure 2-1 shows the sequence of tasks that should be performed [McAndrews 92]. Organizations often tend to overlook the first two steps and jump immediately to prototyping or collecting data. When this happens, measurement is apt to become viewed as just another cost rather than as an integral part of management and process improvement.

[Figure 2-1: Steps for Establishing a Software Measurement Process Within an Organization. The steps labeled in the figure are: develop strategy, establish process, prototype process, establish policy, establish office, and expand program.]

In the context of Figure 2-1, the Establish process step entails identifying and integrating a consistent, goal-related measurement process into an organization's overall software

methodology. Figure 2-2 provides a top-level view of the stages that comprise successful measurement processes [McAndrews 92]. Organizations begin the process with specific management needs. During the first stage, they identify the measurements they mu

