Automated Quality Assurance of Non-Functional Requirements for Testability


Automated Quality Assurance of Non-Functional Requirements for Testability

Abderahman Rashwan

A Thesis in the Department of Computer Science and Software Engineering

Presented in Partial Fulfillment of the Requirements for the Degree of Master of Applied Science in Software Engineering

Concordia University
Montréal, Québec, Canada

April 2015

© Abderahman Rashwan, 2015

Concordia University
School of Graduate Studies

This is to certify that the thesis prepared

By: Abderahman Rashwan
Entitled: Automated Quality Assurance of Non-Functional Requirements for Testability

and submitted in partial fulfillment of the requirements for the degree of

Master of Applied Science in Software Engineering

complies with the regulations of this University and meets the accepted standards with respect to originality and quality.

Signed by the final examining committee:

Dr. Emad Shihab (Chair)
Dr. Leila Kosseim (Examiner)
Dr. Nikolaos Tsantalis (Examiner)
Dr. René Witte (Supervisor)
Dr. Olga Ormandjieva (Supervisor)

Approved by: Chair of Department or Graduate Program Director

Dr. Amir Asif, Dean, Faculty of Engineering and Computer Science

Abstract

Automated Quality Assurance of Non-Functional Requirements for Testability

Abderahman Rashwan

A Software Requirements Specification (SRS) document contains all the requirements that describe a software system to be developed. These requirements are typically separated into Functional Requirements (FRs), which describe the features of the system under development, and Non-Functional Requirements (NFRs), which include quality attributes and design constraints, among others. NFRs can have a significant impact on a system's development time and total cost, as they frequently describe cross-cutting concerns. NFRs that are not testable are typically ignored in system development, as there is no way to verify them. Thus, NFRs must be checked for testability. However, for natural language requirements, this so far had to be done manually, which is time-consuming and therefore costly.

In order to improve software development support, we propose a semantic quality assurance method that automatically detects non-testable NFRs in natural language specifications. Our work contains four significant contributions towards this goal: (1) building a generic ontology that represents the main concepts in requirements statements and their relations; (2) developing, based on this generic ontology, two corpora: a new gold standard corpus containing annotations for different NFR types, and a second corpus for requirements thematic roles and testability; (3) introducing a Support Vector Machine (SVM) classifier that automatically categorizes requirements sentences into the different ontology classes; and (4) using a rule-based text mining system to analyze requirement thematic roles and to flag non-testable NFRs. Based on the SRS corpus, our results demonstrate that the proposed approach is feasible and effective, with an F-measure of 80% for non-testability detection.

Acknowledgments

I pay my sincere gratitude to all the people who made this thesis possible. Much of my appreciation goes to my supervisors, Dr. René Witte and Dr. Olga Ormandjieva, for their continuous guidance and support.

Many thanks to the members of the Semantic Software Lab for their timely suggestions, including Nona Naderi, Elian Angius, and Bahar Sateli. Rolan Abdukalykov, Olga Ormandjieva, Ishrar Hussain, Mohamad Kassab, and Zakaria Siddiqui are acknowledged for annotating the corpus. I also would like to thank Matthew Smith for managing the manual annotation process.

On a personal note, I would like to convey my thanks to my parents and my wife for their inspiration and encouragement to complete this task.

Table of Contents

List of Figures
List of Tables
List of Acronyms

1 Introduction
  1.1 Motivation
  1.2 Problem Statement
  1.3 Research Goals and Objectives
  1.4 Outline

2 Background
  2.1 Software Engineering Concepts
    2.1.1 Requirements Engineering
    2.1.2 Software Requirement Specifications
    2.1.3 Non-Functional Requirements
  2.2 Requirements Quality Assurance
  2.3 Semantic Computing Concepts
    2.3.1 Knowledge Representation using Ontologies
    2.3.2 Natural Language Processing
    2.3.3 Machine Learning
  2.4 Summary

3 Literature Review
  3.1 NLP-based Requirements Engineering
  3.2 NFR Classification
  3.3 Requirement Quality Assurance
  3.4 Semantic Analysis of RE Statements
  3.5 Discussion

4 System Design
  4.1 Methodology
    4.1.1 Ontology Building Phase
    4.1.2 Corpus Annotation Phase
    4.1.3 NFR Classification Phase
    4.1.4 Requirements Thematic Roles Extraction Phase
    4.1.5 Non-Testability Detection Phase
  4.2 System Overview
  4.3 Data Layer
    4.3.1 The NFRs View
    4.3.2 Thematic Roles View
    4.3.3 Fit-Criteria View
  4.4 NFR Preprocessing Layer
    4.4.1 Automatic Classification of Requirements
    4.4.2 Thematic Roles Extractor
  4.5 Quality Assurance Layer
  4.6 Summary

5 Implementation
  5.1 Implementation Tools
    5.1.1 GATE
    5.1.2 Protégé
  5.2 System Implementation
    5.2.1 NFR Ontology
    5.2.2 NFR Classifier
    5.2.3 Ontology Population
    5.2.4 Requirement Analysis ReqAnalysis
  5.3 Summary

6 Corpora and Evaluation
  6.1 NFR Corpora
    6.1.1 Enhanced PROMISE Corpus
    6.1.2 SRS Concordia Corpus
  6.2 System Evaluation
    6.2.1 NFR Classifier
    6.2.2 Thematic Roles Extractor
    6.2.3 Non-Testability Detector
  6.3 Summary

7 Conclusions and Future Work

Bibliography

A NFR Classifier Configuration

B Gazetteer Lists
  B.1 Modality
  B.2 Quantification
  B.3 Condition and Limit

C JAPE Rules for Requirement Thematic Roles Extractor
  C.1 Modality
  C.2 Agent
  C.3 Action
  C.4 Theme
  C.5 Fit-Criteria
  C.6 Condition
  C.7 Instrument
  C.8 Goal
  C.9 Non-Testability Detection

D OwlExporter

List of Figures

1 Defects Map
2 An Example [CV95] of a Separable Problem in a Two-dimensional Space
3 Phases of our Methodology
4 High-Level System Design
5 Requirements Ontology (excerpt)
6 RE Ontology (Thematic Role View)
7 RE Ontology (NFR-Fit Criteria View)
8 NFR Classifier Design
9 Text Mining System Design for Analyzing NL Requirements Statements
10 Auxiliary Verbs Structure
11 Thematic Role Output Example
12 Non-Testability Detector Example
13 GATE Architecture Overview [ea11]
14 NFR Ontology
15 NFR Classifier Pipeline
16 Batch Learning PR
17 NFR Classifier Output Annotations
18 Individuals Populated into the Ontology for Security NFR using OwlExporter
19 SPARQL Query for all Security NFR Sentences in the Ontology using Protégé
20 Requirement Thematic Roles Extractor Pipeline
21 Thematic Roles Rules Example
22 Thematic Roles, Fit-Criteria, and Non-Testability Annotations
23 Manual Annotation Process Example for the Enhanced PROMISE Corpus
24 Examples of Different Types of Syntactic Forms Present in the Enhanced PROMISE Corpus
25 Manual Annotation Process Example for SRS Concordia Corpus

List of Tables

1 Examples for Testable and Non-Testable NFRs
2 NFR Definitions
3 Requirements Defects Definitions [van09]
4 Thematic Roles [JM09]
5 Thematic Roles in SRS Documents
6 Fit-Criteria Concepts Description
7 Analysis of the Fit-Criteria on the Enhanced PROMISE Corpus
8 NFR Classifier Example Input Sentences
9 Patterns for Detecting the Different Thematic Roles in the Requirements
10 Patterns of Different Types of Fit-Criteria
11 System Requirements vs. Design
12 NFR Classes within the Enhanced PROMISE Corpus
13 Corpus Patterns Statistics of the Enhanced PROMISE Corpus
14 SRS Documents and their Source for the SRS Concordia Corpus
15 Numbers of Annotation Classes Sentences per each Document
16 Cohen's Kappa between each Pair of Annotators
17 SVM Results Compared to other Machine Learning Algorithms
18 Results for the SVM Classifiers on the SRS Concordia Corpus
19 Confusion Matrices
20 Comparison between SVM and Indicator Classifiers on the PROMISE Corpus
21 Thematic Role Evaluation Results
22 Evaluation of the Automatic Non-Testability Detector on the Enhanced PROMISE Corpus
23 Non-Testability Detector: Confusion Matrices

List of Acronyms

API Application Programming Interface
CREOLE A Collection of REusable Objects for Language Engineering
DT Decision Tree
EBNF Extended Backus–Naur Form
EM Expectation-Maximization
FR Functional Requirement
GATE General Architecture for Text Engineering
GUI Graphical User Interface
IR Information Retrieval
JAPE Java Annotation Patterns Engine
KNN K-Nearest Neighbor
LR Language Resource
ML Machine Learning
NFR Non-Functional Requirement
NLP Natural Language Processing
NN Neural Network
NR Non Requirement
OWL Web Ontology Language
PAUM Perceptron Algorithm with Uneven Margins
POS Part-of-Speech
PR Processing Resource
QA Quality Assurance
SBVR Semantics of Business Vocabulary and Business Rules
SDLC Software Development Life-Cycle
SQWRL Semantic Query-Enhanced Web Rule Language
SRS Software Requirement Specification
SVM Support Vector Machine
TF-IDF Term Frequency–Inverse Document Frequency
UML Unified Modeling Language
XML Extensible Markup Language

Chapter 1

Introduction

If you can't measure a requirement, it is not really a requirement.
(Suzanne Robertson)

This thesis is concerned with the development of an automatic quality assurance system, focused on providing confidence that non-functional requirements (NFRs) can be fulfilled. The goal of our approach is to ensure the testability of NFRs written in a Software Requirements Specification (SRS) document. The application of this work is to ensure high quality of the NFRs and thereby improve the effectiveness of their subsequent testing. We propose a domain-independent quality assurance framework that extracts the different types of NFRs from requirements text and analyzes their main thematic roles, in order to use them in the quality assurance process.

1.1 Motivation

When an initial set of requirements has been elicited and evaluated, it can be captured in a requirements document. Natural language requirements specifications are the most commonly used form (as opposed to formal models based on a logical framework), accounting for up to 90% of all specifications [MFI04]. However, natural language specifications are prone to a number of errors and flaws, in particular due to the ambiguity inherent in natural language.

Moreover, there is a lack of available methods and tools that aid software engineers in managing textual requirements. As the requirements are written in informal natural language, they cannot be easily analyzed for defects. Our approach to overcoming these challenges is based on natural language processing (NLP), machine learning techniques, and ontologies.

Recent studies show that designers and developers often focus more on the behaviour of a system (i.e., functional requirements (FRs)) and underestimate the cost and time of the NFRs [Kas09]. This can lead to cost and time overruns, and ultimately to project failures. Hence, the detection and classification of NFRs has become more important in Requirements Engineering (RE), and is therefore the goal of the work described here.

Most of the terms and concepts in use for describing NFRs are loosely defined, and often there is no commonly accepted term for a general concept [Gli07]. In [CNYM00], the authors present a decomposition and operationalization of NFRs into types, managing them by refining and inter-relating NFRs, justifying decisions, and determining their impact, as elaborated in the NFR framework [CNYM00]. Decomposition refines NFRs into more detailed NFRs; for instance, performance can be decomposed into response time and throughput. Operationalization results in strategies for achieving the NFRs, such as prototyping for a usability NFR. Another NFR decomposition and operationalization technique is classification, e.g., as provided by the ISO/IEC 25010 international standard [ISO10]. NFR refinement is often enhanced with domain-specific knowledge, as in [JKCW08], where the authors introduce knowledge and rules provided by a domain ontology to induce non-functional requirements in specified domains. Al Balushi and Dabhi [ABSDL07] also use an ontology-based approach to requirements elicitation, aimed at empowering requirements analysts with a knowledge repository that helps in the process of capturing precise non-functional requirements during elicitation interviews. Their approach is based on the application of functional and non-functional domain ontologies (quality ontologies) to underpin the elicitation activities. In contrast, our work aims at providing a more generic solution to all types of NFRs, independent from any context.
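As an illustration, such a decomposition can be represented as a small ontology fragment. The following Python sketch uses the rdflib library with a hypothetical namespace and class names; it is not the ontology developed in this work, which is built with Protégé (see Chapter 5).

    from rdflib import Graph, Namespace, RDFS

    # Hypothetical namespace and class names, for illustration only.
    NFR = Namespace("http://example.org/nfr#")

    g = Graph()
    g.bind("nfr", NFR)

    # Decomposition: refine a coarse NFR type into more detailed sub-types,
    # e.g., performance into response time and throughput.
    decomposition = {
        NFR.Performance: [NFR.ResponseTime, NFR.Throughput],
        NFR.Usability: [NFR.Learnability, NFR.Operability],
    }
    for parent, children in decomposition.items():
        for child in children:
            g.add((child, RDFS.subClassOf, parent))

    # Query the hierarchy: which sub-types refine Performance?
    for child in g.subjects(RDFS.subClassOf, NFR.Performance):
        print(child)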

NFRs that are not testable are typically ignored in system development, as there is no way to verify them. Thus, NFRs must be checked for testability. However, for natural language requirements, this so far had to be done manually, which is time-consuming and therefore costly. We propose a semantic quality assurance method that automatically detects non-testable NFRs in natural language specifications, in order to improve software development support.

1.2 Problem Statement

NFRs represent the borders or constraints of a software system. They are hard to model, as they are stated informally, and difficult to measure, due to their subjective nature.

Requirements artifacts and documents describing the system-to-be are written in natural language during the requirements gathering phase. Requirements are generally categorized into FRs and NFRs. Usually, NFRs receive less attention than FRs, which may lead to project failure, huge budget increases, and/or delays in project delivery [Kas09]. The problem thus has many dimensions: requirement statements written in natural language can be vague and interpreted in different ways, and when NFRs are not testable or quantifiable, they are likely to be ambiguous, incomplete, or incorrect [PA09]. The following examples illustrate this issue [RR06]:

1. "The application shall be user-friendly."
This requirement is vague and non-measurable. A possible re-stated requirement could be:
A new administrator shall be able to add a student, change a student's data, and delete a student within 30 minutes of their first attempt at using the application.

2. "The system shall be intuitive."
The word "intuitive" here is not clear and has different meanings. In addition, we also do not know for what user group it should be intuitive. A re-phrased requirement can be:
The student shall be able to apply for the course within ten minutes of encountering the application for the first time, without reference to any out-of-application help.

3. "The response shall be fast enough."
The concept "fast enough" is not measurable. A modified requirement can be:
The response time shall be no more than 2 seconds for 90 percent of responses, and no more than 5 seconds for the remainder.

In Table 1, we provide examples for both non-testable and testable types of NFRs.
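In this work, non-testability detection is implemented with gazetteer lists and JAPE rules in GATE (see Appendices B and C). As a simplified illustration of the underlying idea only, the following Python sketch flags a requirement that contains a vague phrase but states no quantified fit-criterion; the phrase list and the unit pattern are hypothetical toy stand-ins for the actual gazetteers.

    import re

    # Hypothetical list of vague, non-measurable phrases; the real system
    # uses much larger gazetteer lists (Appendix B).
    VAGUE_PHRASES = {
        "user-friendly", "user friendly", "intuitive", "fast enough",
        "easy to use", "flexible", "as soon as possible",
    }

    # Crude proxy for a fit-criterion: a number followed by a unit or a
    # percentage, e.g. "2 seconds", "90 percent", "30 minutes".
    FIT_CRITERION = re.compile(
        r"\b\d+(?:\.\d+)?\s*(?:%|(?:percent|seconds?|minutes?|hours?|ms|users?)\b)",
        re.IGNORECASE,
    )

    def is_non_testable(requirement: str) -> bool:
        """Flag a requirement that contains a vague phrase but no
        measurable fit-criterion."""
        text = requirement.lower()
        vague = any(phrase in text for phrase in VAGUE_PHRASES)
        measurable = FIT_CRITERION.search(text) is not None
        return vague and not measurable

    if __name__ == "__main__":
        print(is_non_testable("The application shall be user-friendly."))  # True
        print(is_non_testable("The response time shall be no more than "
                              "2 seconds for 90 percent of responses."))   # False

Run on the examples above, such a check flags the vague originals while accepting the quantified rewrites.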

1.3 Research Goals and Objectives

The main goal of this work is to provide an automated quality assurance assessment framework for NFRs. We aim to turn unclear requirements into a testable shape by highlighting all non-testable requirements to the stakeholders, in order to encourage them to improve these requirements. This also makes the system maintainable after the end of a project, and gives the ability to measure pr

