A Practitioner View Of CMMI Process Performance Models


A Practitioner View of CMMI Process Performance Models
Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213
Robert Stoddard and Rusty Young
March 20, 2008
© 2008 Carnegie Mellon University

Permission to use SAS JMP Screen Shots
Screen shots and other statistical tool information have been used with permission from SAS Institute. Information about JMP statistical discovery software can be found at www.jmp.com.
JMP is interactive, comprehensive, visual software from SAS. It dynamically links statistics with graphics right on your Windows, Macintosh, or Linux desktop, empowering you to explore data interactively and bring understanding to your organization.
SAS and all other SAS Institute Inc. product or service names are registered trademarks or trademarks of SAS Institute Inc. in the USA and other countries. ® indicates USA registration. Other brand and product names are trademarks of their respective companies. Copyright 2007 SAS Institute Inc. All rights reserved.

Permission to use Crystal Ball and Minitab Screen Shots
Portions of the input and output contained in this module are printed with permission of Oracle (formerly Decisioneering). Crystal Ball 7.2.2 (Build 7.2.1333.0) is used to capture screenshots in this module. The Web page for Crystal Ball is available at http://www.crystalball.com.
Portions of the input and output contained in this presentation are printed with permission of Minitab Inc., using version 15. The Minitab company web page is http://www.minitab.com.

Topics
- Purpose of this Tutorial
- The Proposal Phase
- Project Management Planning
- The Use of S-curves
- Escaped Defect Analysis Modeling
- Performance Models in Requirements, Design, Build, System Test
- Summary

Purpose of this Tutorial

Purpose
This tutorial is meant to inform practitioners of the:
- Essential ingredients of CMMI Process Performance Models
- Examples of CMMI Process Performance Models across the lifecycle
- Examples of methods to implement various quantitative models for CMMI Process Performance Models

Essential Ingredients of CMMI Process Performance Models
- Statistical, probabilistic, or simulation in nature
- Predict interim and/or final project outcomes
- Use controllable factors tied to subprocesses to conduct the prediction
- Model the variation of factors and understand the predicted range or variation of the outcomes
- Enable "what-if" analysis for project planning, dynamic re-planning, and problem resolution during project execution
- Connect "upstream" activity with "downstream" activity
- Enable projects to achieve mid-course corrections to ensure project success

[Figure: nested categories of models. All models (qualitative and quantitative) contain quantitative models (deterministic, statistical, probabilistic), which contain statistical or probabilistic models, which contain process performance models with controllable x factors tied to processes and/or subprocesses. Outside the innermost set sit models that are anecdotal, use biased samples, model only whole phases or lifecycles, model only uncontrollable factors, have no uncertainty or variation, or model only final outcomes; a PPM predicts interim outcomes and involves controllable x factors.]

When and Why Do We Need Process Performance Models?
[Figure: software lifecycle flow — Software Design, Software Coding, Software Unit Testing, Integration, Systems Testing.]

The Proposal

The Proposal
- Often uses higher-level PPMs (lower-level PPMs can give false precision at this stage)
- Better understanding of the risk in a bid
- Results may be used for bid/no-bid decisions along with other criteria

[Figure: context diagram of the proposed system, showing a Screen (inquiry, input & output), an Inventory Database (CRUD, logical file), an Interface (input & output), a Mechanical Arm Controller (input & output), a Cash Journal (RUD), an Accounting System, a Bar Code Label Maker, and a Wholesaler.]

FP Type       | #  | Weight | Total
Inputs        |  5 |      4 |    20
Outputs       |  7 |      5 |    35
Inquiries     |  2 |      4 |     8
Logical Files |  1 |     10 |    10
Interfaces    |  1 |      7 |     7
Total         |    |        |    80

Function Point Estimate
The function point estimate based on the context diagram results in 80 function points, or 10,240 lines of code. However, two other context diagrams based on the proposal information resulted in estimates of 73 function points (9,344 lines of code) and 96 function points (12,288 lines of code).
This variation in the estimates for the proposed system will be used for the process performance model (PPM) based predictions for the proposal and for managing the project.
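The slide's arithmetic implies a fixed "backfiring" factor of 128 SLOC per function point (80 × 128 = 10,240, and likewise for the other two estimates). A minimal sketch of that conversion, assuming the 128 SLOC/FP factor holds for the implementation language:

```python
# Backfire a function point count into estimated source lines of code.
# The 128 SLOC/FP factor is inferred from the slide's numbers
# (80 FP -> 10,240 LOC); treat it as an assumption for this sketch.
SLOC_PER_FP = 128

def fp_to_sloc(function_points: int, sloc_per_fp: int = SLOC_PER_FP) -> int:
    return function_points * sloc_per_fp

for fp in (73, 80, 96):  # low, nominal, and high context-diagram estimates
    print(f"{fp:3d} FP -> {fp_to_sloc(fp):6,d} SLOC")
```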

Composition Trade-offs and PPMs
[Slide largely unrecoverable; the visible row labels include technology or methodology, process/subprocess, general experience, domain experience, and platform experience.]

Understanding Distributions – Key to Informed Decisions
[Figure: a sample distribution plotted over a 1–10 scale.]

Distributions Describe Variation
Populations of data are characterized as distributions in most statistical procedures:
- expressed as an assumption for the procedure
- can be represented using an equation
Examples of distributions you may come across include the Triangular distribution.
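As a concrete illustration (not from the slides), a triangular distribution needs only a minimum, most-likely, and maximum value, which makes it a natural fit for three-point estimates like the function point counts above. A short sketch, assuming NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)

# Triangular distribution over the three context-diagram estimates:
# minimum 73 FP, most likely 80 FP, maximum 96 FP.
samples = rng.triangular(left=73, mode=80, right=96, size=100_000)

print(f"mean            = {samples.mean():.1f} FP")
print(f"90th percentile = {np.percentile(samples, 90):.1f} FP")
```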

[Figure: Monte Carlo simulation in Crystal Ball. Crystal Ball uses a random number generator to select values for inputs A and B, causes Excel to recalculate all cells, and saves off the different results for C. Crystal Ball then allows the user to analyze and interpret the final distribution of C.]
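The same loop Crystal Ball runs inside Excel can be sketched directly in code: draw the inputs from their assumed distributions, recompute the output for each trial, and study the resulting distribution. The relationship C = f(A, B) below is a hypothetical stand-in for the spreadsheet formula:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(a, b):
    # Hypothetical stand-in for the spreadsheet formula linking A and B to C.
    return a * b

N = 50_000
a = rng.normal(loc=10.0, scale=1.5, size=N)    # assumed distribution for A
b = rng.triangular(0.8, 1.0, 1.3, size=N)      # assumed distribution for B
c = model(a, b)                                # one "recalculation" per trial

print(f"C: mean = {c.mean():.2f}, "
      f"5th pct = {np.percentile(c, 5):.2f}, "
      f"95th pct = {np.percentile(c, 95):.2f}")
```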

Why is Understanding Variation Important?
- Customer wants the product in 10 weeks
- Historical range is 9–11 weeks
- Should the job be accepted?
[Figure: two delivery-time distributions — one with most of its mass above 10 weeks ("probably not") and one with most of its mass below 10 weeks ("probably should").]

Variation, Trade-offs, and PPMs
[Figure: COCOMO-style estimation spreadsheet. Inputs are the function point estimate converted to SLOC and calculated KDSI; tuning parameters a = 2.45 and c = 2.66 ("leave as is unless you have tuned your parameters") and b = 1.01 ("do not change"); and a schedule compression/expansion percentage that defaults to 100 (no compression or expansion). Outputs include nominal effort, effort multiplier, effort (MM), nominal schedule, staff, potential # defects, and latent defects. The shaded cells are where the effects of variation are incorporated using a Monte Carlo simulation.]
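The spreadsheet's structure suggests the classic COCOMO form, effort = a · KDSI^b and schedule = c · effort^d. A sketch using the slide's visible parameters (a = 2.45, b = 1.01, c = 2.66); the schedule exponent d, the distributions, and the effort multiplier are assumptions:

```python
import numpy as np

rng = np.random.default_rng(7)

# COCOMO-style equations; a, b, c are read off the slide, d is assumed.
a, b, c, d = 2.45, 1.01, 2.66, 0.33

N = 20_000
# Size uncertainty from the three context-diagram estimates (KDSI = KSLOC).
kdsi = rng.triangular(9.344, 10.240, 12.288, size=N)
emult = np.clip(rng.normal(1.0, 0.1, size=N), 0.5, None)  # assumed multiplier

effort = a * kdsi**b * emult          # person-months
schedule = c * effort**d              # months

for name, x in (("Effort (MM)", effort), ("Schedule (months)", schedule)):
    p10, p50, p90 = np.percentile(x, [10, 50, 90])
    print(f"{name}: 10% = {p10:.1f}, 50% = {p50:.1f}, 90% = {p90:.1f}")
```

Percentile tables like the ones on the following slides fall straight out of the stored trial results.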

Evaluate Proposal Risk and Negotiate
Run "what-if" exercises holding one or more values constant.
See effects of trade-offs between:
- Schedule
- Effort
- Defects
- Staff
- Functionality

Variation, Trade-offs, and PPMs – Schedule
Forecast: Schedule
Percentile | Forecast value
0%         | 7.6
10%        | 9.4
20%        | 9.7
30%        | 10.0
40%        | 10.2
50%        | 10.4
60%        | 10.6
70%        | 10.8
80%        | 11.1
90%        | 11.5
100%       | 14.0

Variation, Trade-offs, and PPMs – Effort
Forecast: Effort (MM)
Percentile | Forecast value
0%         | 18.9
10%        | 34.1
20%        | 37.4
30%        | 40.0
40%        | 42.3
50%        | 44.6
60%        | 47.1
70%        | 49.8
80%        | 53.2
90%        | 58.4
100%       | 99.8

Variation, Trade-offs, and PPMs – Defects
Forecast: Latent Defects
Percentile | Forecast value
0%         | 5.63
10%        | 8.64
20%        | 9.69
30%        | 10.50
40%        | 11.20
[rows from 50% upward not recoverable]

Variation, Trade-offs, and PPMs – Staff
Forecast: Staff
Percentile | Forecast value
0%         | 4.6
80%        | 4.8
90%        | 5.1
100%       | 7.2
[intermediate rows not recoverable]

Proposal CAR/OID to Mitigate Risk
- Seeing if there are new technologies that, if employed, will reduce risk
- May build/modify PPMs to evaluate impact and ROI
  - May involve a brief pilot
  - May involve industry data
  - May involve professional judgment
  - Each brings its own level of uncertainty to the prediction
- Typically involves detailed project planning PPMs
  - Results at the micro level
  - Extrapolate to the macro level

Proposal CAR/OID
- New technology will increase coding productivity by 10%
- May want to verify with
  - a pilot
  - in-depth testing
  - measured results
- Adjust the proposal model with the results
- Re-predict and evaluate the resulting risks

Plan Project
- Like the proposal, but with
  - More detail
  - Interim as well as end states
- Compose a PDP and construct an initial PPM to ensure it will meet our goals and aid us in managing the project

Initial PPM
Note: the greenish shaded cells on this and succeeding slides are where variations will be accounted for using a Monte Carlo simulation.

Phase                   | UoM               | Size | Effort
Proposal/Early Planning | Function Points   |   80 |    154
Elicit Requirements     | User Requirements |  110 |    723
URD Review              | Defects           |   65 |
Analyze Requirements    | Requirements      |  176 |  1,809
SRS Review              | Defects           |   98 |
Design                  | Components        |  124 |  2,236
Design Review           | Defects           |   72 |
Code                    | Components        |  110 |  2,950
Code Review             | Defects           |   65 |
Test                    | Test Cases        |  229 |  2,633
Deliver                 | Defects           |   10 |    806

Initial PPM
Size and Effort are predicted functions:
SRS size = f(URD size, method, review type, etc.)
Effort = f(document size, method, experience, etc.)
[Table as on the previous slide.]

Initial PPM
Size and Effort are predicted functions:
SRS size = f(URD size, method, review type, etc.)
Effort = f(document size, method, experience, etc.)
Annotated on the table: Effort = constant multiplier × size, with the multiplier a function of (method, experience, training), and Experience = (domain, customer, platform, general). A sketch of this effort function follows below.
[Table as on the previous slide.]
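A minimal sketch of that "multiplier × size" idea; every factor level and multiplier value here is invented for illustration, and a real PPM would calibrate them from historical data (the 10.3 baseline is back-computed from the table's Analyze Requirements row, 1,809 effort / 176 requirements):

```python
# Effort = constant multiplier * size, where the multiplier depends on
# controllable factors. All factor values below are hypothetical.
BASE_EFFORT_PER_UNIT = 10.3   # back-computed: 1,809 effort / 176 requirements

METHOD_FACTOR = {"structured": 1.00, "prototyping": 0.90}
EXPERIENCE_FACTOR = {"low": 1.30, "medium": 1.00, "high": 0.80}

def predict_effort(size: float, method: str, experience: str) -> float:
    multiplier = METHOD_FACTOR[method] * EXPERIENCE_FACTOR[experience]
    return BASE_EFFORT_PER_UNIT * multiplier * size

# Analyze Requirements phase with a highly experienced team:
print(f"{predict_effort(176, 'structured', 'high'):.0f} effort units")
```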

Initial PPM
[Table: predicted defects by phase — URD defects, SRS defects, design defects, code defects, and latent defects; the numeric values are not recoverable from the transcription.]

Initial PPM
[Table: defect flow and rework by phase (URD, SRS, and later artifacts); the numeric values are not recoverable from the transcription.]

PDP Risk
[Figure: not recoverable from the transcription.]


An Alternate Example
Process Performance Model (PPM) to support Escaped Defect Analysis and Monitoring

The Situation during Development
Defects escaping from one development phase to the next are very expensive to find, diagnose, and fix. Some industrial studies suggest the increasing cost may be exponential.
OUR NEED: A PPM used by the software project manager and quality team to analyze escaping defect rates by type, to support more informed decisions on where to target dynamic project corrective action, as well as changes to organizational processes!

Details of the Escaping Defect PPM
The outcome, Y, is the number of escaped defects by type within each phase of development.
The x factors used in this model will be the various injection and detection rates by type of defect across the phases of development.
Not only will this model focus on phase containment in the Req'ts, Design, and Code phases, but also on the phase screening of defects by type within the different types of testing.

Background Information on the Data
Historical data on escaped defects, by type, across lifecycle phases was recorded.
For each historical project, software size was recorded as well, to help normalize the defects injected and found, thereby producing injection and detection rates.
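In code form, the normalization step might look like the following sketch; the field names and the example size of 250 KSLOC are assumptions, not values from the slides (the defect counts reuse the Design-phase figures that appear on the following slides):

```python
# Turn raw defect counts into size-normalized injection and detection rates.
def rates(defects_injected: int, defects_found: int, ksloc: float):
    injection_rate = defects_injected / ksloc          # defects per KSLOC
    detection_rate = defects_found / defects_injected  # fraction caught
    return injection_rate, detection_rate

inj, det = rates(defects_injected=4200, defects_found=2000, ksloc=250.0)
print(f"injection: {inj:.1f} defects/KSLOC, detection: {det:.0%}")
```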

A modern spreadsheet for escaped defect analysis, before being transformed into a CMMI Process Performance Model.

Let's look at the matrix showing "Phase Injected" vs. "Phase Found".

For example, an average of 2,000 design defects were found during the design activity.

Let's look at the "Phase Containment" and "Phase Screening" rates.

Here, 2,080 Requirements and Design defects were caught during Design (% of all defects entering and injected in Design caught in Design).
Here, 2,000 Design defects were caught during Design (% of Design defects caught in Design).

Let's look at the Phase Injection and Escape rates.

Here, 4,200 Design defects were injected, with 2,200 of them escaping the Design activity. Additionally, 2,450 total defects (injected during Design or inherited from upstream activities) escaped past the Design activity.

Here, 36% of all defects in a project are expected to be Design defects; 52% of Design defects are expected to escape past Design; and 54% of all types of defects in the Design activity (injected during Design or inherited from upstream activities) are escaping past the Design activity.
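These containment and screening percentages follow directly from the counts on the preceding slides, as the short computation below shows (the only derived quantity is the total number of defects present in Design, taken as those caught there plus those escaping past it):

```python
# Reproduce the slide's Design-phase containment and screening figures.
injected_in_design = 4200        # Design defects injected
design_found_in_design = 2000    # Design defects caught during Design
all_found_in_design = 2080       # defects of any origin caught during Design
all_escaped_past_design = 2450   # defects of any origin escaping past Design

# Containment: fraction of Design-injected defects caught in Design.
containment = design_found_in_design / injected_in_design

# Everything present in Design = caught there + escaped past it.
total_in_design = all_found_in_design + all_escaped_past_design

# Screening: fraction of everything present in Design caught in Design.
screening = all_found_in_design / total_in_design

print(f"Design containment: {containment:.0%} (escape: {1 - containment:.0%})")
print(f"Design screening:   {screening:.0%} (escape: {1 - screening:.0%})")
```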

Now, let's transform this spreadsheet model into a valid CMMI process performance model!


Each of the green cells has received an uncertainty distribution based on historical data.


Each of these blue cells was identified as an outcome whose resulting distribution will be studied.

Each of these blue cells was identified as an outcome whose resulting distribution will be studied.


Standard simulation summary results.

We are 95% confident that no more than approx. 4,786 Design defects will be injected by a project.

We are 95% confident that no more than 39% of all types of defects will be Design defects.

We are 95% confident that no more than 2,351 Design defects will escape the Design activity.

We are 95% confident that no more than 61% of Design defects will escape the Design activity.

We are 95% confident that no more than 2,607 defects (injected during Design or inherited from upstream activities of Design) will escape the Design activity.

We are 95% confident that no more than 62% of the total defects (injected during Design or inherited from upstream activities of Design) will escape the Design activity.

We are 95% confident that no less than 1,499 total defects (injected during Design or inherited from upstream activities of Design) will be found during the Design activity.

We are 95% confident that no less than 38% of total defects (injected during Design or inherited from upstream activities of Design) will be found during the Design activity.

We are 95% confident that no less than 47 Design defects will be found during the Design activity.

We are 95% confident that no less than 39% of Design defects will be found during the Design activity.

We are 95% confident that no more than 46% of all defects will escape at least one phase.

We are 95% confident that escaping defects will not escape, on average, more than 1.88 phases.

Using Sensitivity Analysis, we can learn more about which factors in our model are contributing most to our outcome.

What Have We Accomplished?
- We transformed a model that used only historical averages and substituted uncertainty distributions for each of the injection and found rates
- By doing this, we can establish confident conclusions about our outcomes using their resulting distributions:
  - Defect injection rates by phase
  - Phase containment and screening effectiveness
- We also used sensitivity analysis to decide which factors to tackle first to improve each outcome
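The whole transformation can be sketched in a few lines: replace each average rate with a distribution, simulate, read the confidence statements off the percentiles, and rank-correlate inputs with the outcome for the sensitivity analysis. Every distribution parameter below is an assumption standing in for the historical data:

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
N = 100_000

# Uncertainty distributions replacing the historical averages (assumed).
injected = rng.normal(4200, 350, size=N)    # Design defects injected
containment = rng.beta(20, 22, size=N)      # fraction caught in Design (~48%)

found = injected * containment
escaped = injected - found

# "We are 95% confident that no more than X Design defects will escape"
# is read off the 95th percentile of the simulated outcome.
print(f"95th pct of escaped Design defects: {np.percentile(escaped, 95):,.0f}")

# Sensitivity analysis: which input drives the outcome's variation most?
for name, x in (("injected", injected), ("containment", containment)):
    rho, _ = spearmanr(x, escaped)
    print(f"rank correlation of escaped with {name}: {rho:+.2f}")
```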

Planning the Requirements Buildup

Generate Plan
- Generate a schedule based on higher-level PPMs, which help determine milestones and variation-based slack
- Generate detailed PPMs to predict performance during this phase
Note: these steps will be repeated periodically (such as at phase or other selected milestones) and on an as-needed basis.

Elicitation – Requirements Buildup – Predicted
[Figure: Generalized Logistic (or Richard's) curve of cumulative requirements (0–120) against time (weeks 1–26).]
- A controls the lower asymptote
- C controls the upper asymptote
- M controls the time of maximum growth
- B controls the growth rate
- T controls where maximum growth occurs – nearer the lower or upper asymptote
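One common parameterization of the Richards curve that matches the roles the slide assigns to A, C, M, B, and T is sketched below; the exact form used on the slide is not shown, so treat this as an assumption (the 110-requirement asymptote echoes the initial PPM table):

```python
import numpy as np

def richards(t, A, C, M, B, T):
    """Generalized logistic (Richards) curve: A = lower asymptote,
    C = upper asymptote, M = time of maximum growth, B = growth rate,
    T = asymmetry controlling where maximum growth occurs."""
    return A + (C - A) / (1.0 + T * np.exp(-B * (t - M))) ** (1.0 / T)

t = np.arange(1, 27)  # weeks 1..26, matching the slide's x-axis
predicted = richards(t, A=0.0, C=110.0, M=10.0, B=0.5, T=1.0)
print(np.round(predicted, 1))  # predicted cumulative requirements per week
```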

Elicitation – Requirements Buildup – Predicted
[Figure: the same Richards curve.] The curve parameters typically would be functions with controllable factors such as elicitation method (JAD, prototyping, reengineering, etc.), team experience, domain experience, # staff, etc.

Elicitation – Requirements Buildup – Predicted
[Figure: the same Richards curve.] Calibration of this model for different domains, customers, etc., and of the effects of the controllable factors, is critical.

Risk in Plan
- Use PPMs to judge overall risk in the plan
- May use Monte Carlo simulation in the schedule to better understand
  - Schedule-based sources of risk
  - Effects of risks on the schedule

Elicitation – Requirements Buildup – Monitor
[Figure: actual buildup plotted against the predicted curve.]
- Monitor the buildup
- Flattening means you are reaching the point of diminishing returns for elicitation
- A significant difference between the predicted and actual upper asymptote indicates a potential misunderstanding of the system to be built
- If actuals show significant variation from predicted, refit the curve for a new prediction
  - Calculate an appropriate Prediction Interval (PI) to aid in detection of anomalous conditions
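Refitting the curve to the actuals can be done with ordinary nonlinear least squares. A sketch, reusing the hypothetical Richards parameterization above with invented weekly counts:

```python
import numpy as np
from scipy.optimize import curve_fit

def richards(t, A, C, M, B, T):
    return A + (C - A) / (1.0 + T * np.exp(-B * (t - M))) ** (1.0 / T)

# Invented cumulative requirement counts from weekly monitoring.
weeks = np.arange(1, 16)
actuals = np.array([2, 4, 8, 14, 22, 33, 45, 57, 68, 77, 84, 90, 94, 97, 99])

# Refit, starting from the planning-time parameters; the bounds keep the
# optimizer away from invalid (non-positive) T values.
p0 = [0.0, 110.0, 10.0, 0.5, 1.0]
bounds = ([-10, 50, 0, 0.01, 0.01], [10, 300, 30, 5, 10])
(A, C, M, B, T), _ = curve_fit(richards, weeks, actuals, p0=p0, bounds=bounds)

print(f"refit upper asymptote C = {C:.0f} requirements (planned: 110)")
```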

Elicitation – Requirements Buildup – Example 2
[Figure: actual buildup plotted against the predicted curve over roughly 25 weeks.]

Elicitation – Requirements Buildup – Example 2a
[Figure: the same plot.] The process is not performing at its historical levels. Investigate and calibrate.

Elicitation – Requirements Buildup – Example 2b
[Figure: the refit curve.] The refit curve has extended the predicted buildup time by approximately 20% (31 vs. 26 weeks).

Elicitation – Requirements Buildup – Notes
- Requires a strong, consistent requirements elicitation process
  - Different standard curves for different elicitation processes, such as JAD, prototyping, etc.
  - Curve shape parameters can be influenced by context – reengineering vs. green field, well-understood vs. not-well-understood domain, experience, etc.
- Consider a measurement systems error analysis – perhaps using Gage R&R – to ensure consistent buildup counts
- Requires a good prediction of size
- Can be beneficial with a good size prediction and then fitting the curve as you gather data
  - It will take time before variation in the buildup time minimizes

Another Example
Process Performance Model (PPM) in the Requirements Phase

The Situation in the Requirements Phase
Our products are comprised of 40–60 features.
We assign each feature a small development team to develop the feature "cradle to grave."
These feature teams operate in overlapping lifecycles within an overall product incremental waterfall lifecycle model (thus, different features will be added in each new increment).
OUR NEED: A PPM that will let each feature team predict the number of requirements defects to be experienced throughout the lifecycle of the feature development.

Details of the Requirements Phase PPM
The outcome, Y, is the predicted number of Requirements defects for a given feature team.
The x factors used to predict the Requirements defects are:
- x1: Req'ts Volatility (continuous data)
- x2: Risk of Incomplete Req'ts (nominal data)
- x3: Risk of Ambiguous Req'ts (nominal data)
- x4: Risk of Non-Testable Req'ts (nominal data)
- x5: Risk of Late Req'ts (nominal data)

Background Information on the Data
We collected historical data (of the Y and the x's) for a large volume of feature teams.
For the x2 through x5 factors, the feature team leader would historically check off "yes" or "no" depending on whether they felt that the specific risk significantly impacted the feature team's cost, schedule, or quality.
Operational definitions and training were conducted to ensure consistency and repeatability among the feature team leads.

Development of the Req'ts Phase PPM - 1
[Screenshot of the data set: the Y outcome is the number of Req'ts defects; x1 (volatility) is shown in decimal form; all of the risks are rated either 0 or 1, with 0 being the absence of the risk and 1 being the presence of the risk.]

Development of the Req'ts Phase PPM - 2
[Screenshot.]

Development of the Req'ts Phase PPM - 3
[Screenshot.] This will accomplish Dummy Variable Regression to handle x factors that are continuous and discrete; a code sketch of the idea follows below.
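As a sketch of what dummy-variable regression does here, the 0/1 risk flags enter the regression directly as dummy variables alongside the continuous volatility measure. The data, coefficients, and library choice are all illustrative assumptions, not the slide's actual Minitab/JMP output:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 120

# Invented historical data: x1 continuous, x2..x5 are 0/1 risk dummies.
x1 = rng.uniform(0.0, 0.5, n)                       # req'ts volatility
x2, x3, x4, x5 = (rng.integers(0, 2, n) for _ in range(4))
y = 20 + 60*x1 + 8*x2 + 6*x3 + 5*x4 + 4*x5 + rng.normal(0, 3, n)

X = sm.add_constant(np.column_stack([x1, x2, x3, x4, x5]))
model = sm.OLS(y, X).fit()
print(model.params)   # intercept and coefficients for x1..x5

# Kickoff-time use: anticipate the x values and predict req'ts defects.
new_team = np.array([[1.0, 0.30, 1, 0, 0, 1]])  # volatility 0.30; x2, x5 risks
print(f"predicted req'ts defects: {model.predict(new_team)[0]:.1f}")
```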

Development of the Req'ts Phase PPM - 4
[Screenshot.]

Development of the Req'ts Phase PPM - 5
[Screenshot.]

Development of the Req'ts Phase PPM - 6
[Screenshot.]

Intended Use of this Req'ts PPM
Once we decide on the final form of the PPM, we will use it in two primary ways:
1) At the beginning of each feature team kickoff, the team will anticipate the values for the x factors (x1–x5). They will evaluate the PPM at these values to predict the number of Req'ts defects. If this prediction is unacceptable, they will take immediate action to address one or more of the x factors.
2) During the development, the feature team will periodically re-assess the anticipated values of the x factors and repeat the actions of step 1 above.

Updating this Req'ts PPM
As more feature teams develop features, they will continue to record the data for the x factors and the resulting Y outcome of the number of Req'ts defects.
When a group of feature teams have finished the lifecycle and have recorded their data, the organization may choose to add their data to the existing data set and then repeat the exercise of developing the dummy variable regression equation.
Ultimately, the organization may want to segment the feature teams by type and conduct this analysis for each segment.

An Example Process Performance Model (PPM) during the Design Phase

The Situation in Design
The Design team is faced with modifying legacy software in addition to developing new software.
A major issue that can have disastrous effects on projects is the idea of "brittleness" of software. In a nutshell, software becomes more "brittle" over time as it is changed and experiences a drifting usage model.
OUR NEED: A PPM used by each feature team during design to predict how "brittle" software is, and subsequently to make the correct design decisions regarding degree of modification vs. rewrite from scratch.

Details of the Software Brittleness PPM
The outcome, Y, is the measure of software brittleness, measured on an arbitrary scale of 0 (low) to 100 (high), which will be treated as continuous data.
The x factors used in this prediction example are the following:
- Unit path complexity
- Unit data complexity
- Number of times the unit code files have been changed
- Number of unit code changes not represented in Design document updates

Background Information on the Data
We collected historical data from feature teams on their code units. The data related to the first four x factors are maintained by the CM system, using automated tools tracking this data each time new code file versions are checked in:
- Unit path complexity
- Unit data complexity
- Number of times the unit code files have been changed
We also have access to problem reporting and inspection databases, which provide us with the number of issues reported against individual code units. Finally, we have "Brittleness" values for each unit of code that were assigned by a separate empirical exercise with domain experts.

Development of the Brittleness PPM - 1
[Screenshot.]

Development of the Brittleness PPM - 2
[Screenshot.]

Development of the Brittleness PPM - 3
[Screenshot.]

Development of the Brittleness PPM - 4
[Screenshot.]

Development of the Brittleness PPM - 5
[Screenshot.]

Development of the Brittleness PPM - 6
[Screenshot.]
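A sketch of the regression behind slides like these, with invented data standing in for the CM-system history and the expert-assigned brittleness scores; the rewrite threshold at the end is likewise hypothetical:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(11)
n = 200

# Invented unit-level measures for the four x factors named earlier.
path_cx   = rng.integers(1, 40, n)     # unit path complexity
data_cx   = rng.integers(1, 25, n)     # unit data complexity
n_changes = rng.integers(0, 60, n)     # times the unit's files were changed
undoc     = rng.integers(0, 15, n)     # changes not reflected in design docs

# Invented expert brittleness scores on the 0-100 scale.
brittleness = np.clip(
    5 + 0.6*path_cx + 0.4*data_cx + 0.8*n_changes + 1.5*undoc
    + rng.normal(0, 5, n), 0, 100)

X = sm.add_constant(np.column_stack([path_cx, data_cx, n_changes, undoc]))
fit = sm.OLS(brittleness, X).fit()

# Design-time use: predict brittleness for a unit and apply a (hypothetical)
# modify-vs-rewrite threshold.
unit = np.array([[1, 30, 18, 45, 9]])
pred = fit.predict(unit)[0]
print(f"predicted brittleness = {pred:.0f} -> "
      f"{'rewrite' if pred > 70 else 'modify'}")
```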

Intended Use of this Brittleness PPM - 1
Once we decide on the final form of the PPM, we will use it in two primary ways:
1) As each feature team begins software design, the team will collect the x factor information for the software units to be worked on, and then evaluate the PPM at these values to predict the brittleness of each unit.

