DO-178C: A New Standard For Software Safety Certification


DO-178C: A New Standard for Software Safety Certification

SSTC 2010, Salt Lake City, Utah
Track 1, Monday, 26 April 2010, 3:30 – 4:15 pm

Ben Brosgol (brosgol@adacore.com)
Cyrille Comar (comar@adacore.com)

AdaCore (www.adacore.com)
North American Headquarters: 104 Fifth Avenue, 15th Floor, New York, NY 10011, USA; +1-212-620-7300 (voice), +1-212-807-0162 (fax)
European Headquarters: 46 rue d'Amsterdam, 75009 Paris, France; +33-1-4970-6716 (voice), +33-1-4970-0552 (fax)

Presented at the 22nd Systems and Software Technology Conference (SSTC), 26-29 April 2010, Salt Lake City, UT. Sponsored in part by the USAF. Approved for public release; distribution unlimited.

Outline

DO-178B Summary
- Levels
- Life-Cycle Model
- Objectives
- Role of Testing
- Related documents

DO-178C
- Organization of revision effort
- Terms of Reference / rationale for approach
- Changes to Core Document
- Technology Supplements*
  - Tool Qualification
  - Model-Based Design and Verification
  - Object-Oriented Technology
  - Formal Methods

* Based on information available in February 2010

Safety-Critical Software: Background

What is "safety-critical" software?
- Failure can cause loss of human life or have other catastrophic consequences

How does safety criticality affect software development?
- Regulatory agencies require compliance with certification requirements
- Safety-related standards may apply to the finished product, the development process, or both

Prescriptive
- Specify requirements on the process by which software is developed and fielded
- A sound process adds confidence in the soundness of the result
- Example: DO-178B

Goal-based
- Developer provides safety cases
  - Claims concerning the system's safety-relevant attributes
  - Arguments justifying those claims
  - Evidence backing up the arguments
- Example: UK Defence Standard 00-56
  - "A Safety Case is a structured argument, supported by a body of evidence, that provides a compelling, comprehensible and valid case that a system is safe for a given application in a given environment"

Perspectives on DO-178B's Process-Based Approach

Quotes from Gérard Ladier (Airbus), FISA-2003 conference:
- "It is not feasible to assess the number or kinds of software errors, if any, that may remain after the completion of system design, development, and test"
- "Since dependability cannot be guaranteed from an assessment of the software product, it is necessary to have assurance on its development process"
- "You can't deliver clean water in a dirty pipe"

Quote from John Rushby, HCSS Aviation Safety Workshop, October 2006:
- "Because we cannot demonstrate how well we've done, we'll show how hard we've tried"

DO-178B Basics

Software Considerations in Airborne Systems and Equipment Certification, December 1992, published by RTCA*
- EUROCAE** edition: ED-12B in Europe

Comprises a set of 66 "objectives" ("guidelines") for production of software for airborne systems
- Reliability: the system does what it is supposed to do ⇒ no failures
  - Can trace each requirement to its implementing code and verification
  - No missing functionality
- Safety: the system does not do what it is not supposed to do ⇒ no hazards
  - Can trace each piece of code back to a requirement
  - No additional functionality, no "dead code"
- Requires appropriate configuration management and quality assurance

"Level" of software establishes which objectives apply

* RTCA (www.rtca.org) is a U.S. Federal Advisory Committee whose recommendations guide FAA policy
** European Organisation for Civil Aviation Equipment (www.eurocae.org)
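The bidirectional tracing described above (requirements to code and tests, and code back to requirements) can be sketched as a toy consistency check. This is an illustration only, not part of DO-178B; all requirement IDs, code-unit names, and test-case names below are invented.

```python
# Toy traceability check in the spirit of DO-178B's objectives.
# Every identifier here (HLR-xxx, nav.c units, TC-xxx) is hypothetical.

# Forward trace: each requirement should map to implementing code
# and to verification evidence (test cases).
req_to_code = {
    "HLR-001": ["nav.c:compute_heading"],
    "HLR-002": ["nav.c:compute_altitude"],
}
req_to_test = {
    "HLR-001": ["TC-010"],
    "HLR-002": [],                      # gap: requirement with no test
}

# Backward trace: each piece of code should trace to some requirement.
all_code_units = {
    "nav.c:compute_heading",
    "nav.c:compute_altitude",
    "nav.c:debug_dump",                 # traces to nothing: dead-code candidate
}

# Requirements lacking verification, and code with no requirement behind it.
unverified = sorted(r for r, tests in req_to_test.items() if not tests)
traced_units = {u for units in req_to_code.values() for u in units}
untraceable = sorted(all_code_units - traced_units)

print("Requirements lacking tests:", unverified)
print("Code tracing to no requirement:", untraceable)
```

In this toy data set the check flags HLR-002 (no test) and nav.c:debug_dump (no requirement), the two kinds of gap the reliability and safety bullets above call out.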

Software Levels Based on System Safety Assessment

Level A ("Safety-Critical")
- Anomalous behavior ⇒ catastrophic failure condition
- "prevent continued safe flight and landing"

Level B
- Anomalous behavior ⇒ hazardous / severe-major failure condition
- "serious or potentially fatal injuries to a small number of occupants"

Level C
- Anomalous behavior ⇒ major failure condition
- "discomfort to occupants, possibly including injuries"

Level D
- Anomalous behavior ⇒ minor failure condition
- "some inconvenience to occupants"

Level E (not addressed in DO-178B)
- Anomalous behavior ⇒ no effect on aircraft operational capability or pilot workload

Structure / Goals / Usage

DO-178B guidelines are organized into three major categories, each with a specified set of output artifacts:
- Software Planning Process
- Software Development Processes
- "Integral" Processes

Appears oriented around new development efforts
- But may be applied to previously developed software, COTS, etc.

Strong emphasis on traceability

Implies a traditional / static program build model
- Compile, link, execute

Used by the FAA to approve software for commercial aircraft
- Developer organization supplies certification material
- Designated Engineering Representative ("DER") evaluates it for compliance with DO-178B

"In a nutshell, what does this DO-178B specification really do?"*
- "It specifies that every line of code be directly traceable to a requirement and a test routine, and that no extraneous code outside of this process be included in the build"

* Esterel Technologies, DO-178B FAQs, www.esterel-technologies.com/do-178b/

Other Aspects of DO-178B

Not specific to aviation
- Could be applied, in principle, to other domains (medical devices, etc.)

Includes a glossary of terms
- "dead code", "deactivated code", "verification", ...

Does not dictate:
- A particular development process, design approach, or notation
- A particular approach to hazard assessment (fault tree analysis, etc.)
- Specific programming language(s) or software tools
- Requirements for personnel training
- Format for artifacts

Tool qualification
- Ensures that a tool provides confidence at least equivalent to that of the process(es) eliminated, reduced, or automated
- Can qualify as a verification tool (a bug may fail to detect errors but won't introduce any) or as a development tool (a bug may introduce errors)

What about security?
- No specific security-related objectives in DO-178B
- Work in progress under RTCA (SC-216) and EUROCAE (WG-72)

DO-178B Software Life Cycle Model

Software Planning Process (outputs):
- Plan for Software Aspects of Certification
- Software Development Plan
- Software Verification Plan
- Software Configuration Management Plan
- Software QA Plan

Software Development Processes (concurrent activities):
- Requirements ⇒ High-Level Requirements, Derived Requirements
- Design ⇒ Design Description, Low-Level Requirements, Derived Requirements
- Coding ⇒ Source Code
- Integration ⇒ Object Code, Executable Object Code, Link/Load Data

Integral Processes:
- Software Verification
- Software Configuration Management
- Software Quality Assurance
- Certification Liaison

Summary of DO-178B "Objectives"

Number of objectives per process category, by safety level. "n(ind) + m" ⇒ n objectives must be shown to be satisfied "with independence", plus m without.

| Process                                                          | A          | B           | C  | D  |
| Software Planning Process                                        | 7          | 7           | 7  | 2  |
| Software Development Processes                                   | 7          | 7           | 7  | 7  |
| Verification of Outputs of Software Requirements Process         | 3(ind) + 4 | 3(ind) + 4  | 6  | 3  |
| Verification of Outputs of Software Design Process               | 6(ind) + 7 | 3(ind) + 10 | 9  | 1  |
| Verification of Outputs of Software Coding & Integration Procs.  | 3(ind) + 4 | 1(ind) + 6  | 6  | 0  |
| Testing of Outputs of Integration Processes                      | 2(ind) + 3 | 1(ind) + 4  | 5  | 3  |
| Verification of Verification Process Results                     | 8          | 7           | 6  | 1  |
| Software Configuration Management Process                        | 6          | 6           | 6  | 6  |
| Software Quality Assurance Process                               | 3          | 3           | 2  | 2  |
| Certification Liaison Process                                    | 3          | 3           | 3  | 3  |
| Totals                                                           | 66         | 65          | 57 | 28 |

Sample DO-178B Objectives [Table A-5]
Verification of Outputs of Software Coding and Integration Processes

| Objective                                                        | Levels  |
| Source Code complies with low-level requirements                 | A B C   |
| Source Code complies with software architecture                  | A B C   |
| Source Code is verifiable                                        | A B     |
| Source Code conforms to standards                                | A B C   |
| Source Code is traceable to low-level requirements               | A B C   |
| Source Code is accurate and consistent                           | A B C   |
| Output of software integration process is complete and correct   | A B C   |

Output: Software Verification Results
Underlining of a level ⇒ "objective should be satisfied with independence"

Some Issues Related to Table A-5 Objectives [§6.3.4]

Reviews and analyses of the Source Code

Conformance to standards:
- Complexity restrictions
- Code constraints consistent with system safety objectives

Accuracy and consistency:
- Stack usage
- Fixed-point arithmetic overflow and resolution
- Resource contention
- Worst-case execution timing
- Exception handling
- Use of uninitialized variables or constants
- Unused variables or constants
- Data corruption due to task or interrupt conflicts
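One of the accuracy-and-consistency items above, fixed-point arithmetic overflow, can be illustrated with a minimal sketch. The "altitude accumulator" scenario, the function names, and the 16-bit width are all invented for the example; the point is only the defect class a Table A-5 review hunts for.

```python
# Hypothetical illustration of a fixed-width overflow defect.

INT16_MAX, INT16_MIN = 32767, -32768

def wrap16(x):
    """Model two's-complement wraparound of a signed 16-bit integer,
    as an unchecked fixed-width addition would behave."""
    x &= 0xFFFF
    return x - 0x10000 if x >= 0x8000 else x

def accumulate_naive(alt, delta):
    # Defect the review looks for: the sum silently wraps past INT16_MAX
    return wrap16(alt + delta)

def accumulate_saturating(alt, delta):
    # One possible remedy: clamp (saturate) instead of wrapping
    return max(INT16_MIN, min(INT16_MAX, alt + delta))

print(accumulate_naive(32000, 1000))       # wraps to -32536, a "negative altitude"
print(accumulate_saturating(32000, 1000))  # pinned at 32767
```

A range analysis or code review would flag that `alt + delta` can exceed the representable range; the saturating variant is one fix, widening the intermediate type is another.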

Sample DO-178B Objectives [Table A-7]
Verification of Verification Process Results

| Objective                                                                               | Levels  |
| Test procedures are correct                                                             | A B C   |
| Test results are correct and discrepancies explained                                    | A B C   |
| Test coverage of high-level requirements is achieved                                    | A B C D |
| Test coverage of low-level requirements is achieved                                     | A B C   |
| Test coverage of software structure (modified condition/decision coverage) is achieved  | A       |
| Test coverage of software structure (decision coverage) is achieved                     | A B     |
| Test coverage of software structure (statement coverage) is achieved                    | A B C   |
| Test coverage of software structure (data coupling and control coupling) is achieved    | A B C   |

Outputs: Software Verification Cases and Procedures; Software Verification Results
Underlining of a level ⇒ "objective should be satisfied with independence"
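The structural-coverage objectives in the table differ in strength: decision coverage only requires each decision to take both outcomes, while MC/DC (Level A) additionally requires each condition to be shown to independently affect the decision. A minimal sketch, with an invented two-condition decision:

```python
# Sketch of decision coverage vs. MC/DC. The decision and the test
# vectors are hypothetical, chosen only to show the difference.

def command_ok(wow, alt_low):
    # Decision with two conditions: both must hold
    return wow and alt_low

def shows_independence(tests, index):
    """True if some pair of tests differs only in condition `index`
    and that difference flips the decision's outcome -- the MC/DC
    criterion for that condition."""
    return any(
        t1[index] != t2[index]
        and all(t1[i] == t2[i] for i in range(len(t1)) if i != index)
        and command_ok(*t1) != command_ok(*t2)
        for t1 in tests for t2 in tests
    )

# Two vectors give decision coverage (outcomes True and False both seen)...
decision_tests = [(True, True), (False, False)]
# ...but MC/DC on "A and B" needs n + 1 = 3 vectors.
mcdc_tests = [(True, True), (False, True), (True, False)]

print(all(shows_independence(decision_tests, i) for i in (0, 1)))  # False
print(all(shows_independence(mcdc_tests, i) for i in (0, 1)))      # True
```

The decision-coverage pair changes both conditions at once, so neither condition's individual effect is demonstrated; the three MC/DC vectors toggle one condition at a time, which is why MC/DC generally demands more tests as decisions grow more complex.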

Role of Testing in Software Verification

Test cases are to be derived from software requirements:
- Requirements-based hardware/software integration testing
- Requirements-based software integration testing
- Requirements-based low-level testing

Test cases must fully cover the code
- Unexercised code may be due to any of several reasons
  - Missing requirement ⇒ add a new requirement


