Keywords:
Requirements Quality Assurance; Requirements Completeness; Natural-language Requirements; Domain Modeling; Case Study Research
Abstract:
[Context] Domain modeling is a common strategy for mitigating incompleteness in requirements. While the benefits of domain models for checking the completeness of requirements are anecdotally known, these benefits have never been evaluated systematically. [Objective] We empirically examine the potential usefulness of domain models for detecting incompleteness in natural-language requirements. We focus on requirements written as “shall”-style statements and domain models captured using UML class diagrams. [Methods] Through a randomized simulation process, we analyze the sensitivity of domain models to omissions in requirements. Sensitivity is a measure of whether a domain model contains information that can lead to the discovery of requirements omissions. Our empirical research method is case study research in an industrial setting. [Results and Conclusions] We had experts construct domain models in three distinct industry domains. We then report on how sensitive the resulting models are to simulated omissions in requirements. We observe that domain models exhibit near-linear sensitivity to both unspecified (i.e., missing) and under-specified requirements (i.e., requirements whose details are incomplete). The level of sensitivity is more than four times higher for unspecified requirements than for under-specified ones. These results provide empirical evidence that domain models provide useful cues for checking the completeness of natural-language requirements. Further studies remain necessary to ascertain whether analysts can effectively exploit these cues for incompleteness detection.
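The sensitivity analysis sketched in the abstract can be illustrated with a small, hypothetical simulation in Python. The trace links, identifiers, and the exact sensitivity formula below are illustrative assumptions for exposition only; they are not taken from the paper.

import random

# Hypothetical trace links: each requirement maps to the domain-model
# elements (classes, attributes, associations) whose concepts it mentions.
# Data and formula are illustrative assumptions, not the paper's procedure.
trace = {
    "R1": {"Controller", "Sensor", "Controller-monitors-Sensor"},
    "R2": {"Alarm", "Operator", "Alarm-notifies-Operator"},
    "R3": {"Sensor", "Reading", "Sensor-produces-Reading"},
    "R4": {"Operator", "Report"},
}

def simulate_unspecified(trace, n_runs=1000, seed=0):
    """Randomly omit one whole requirement per run and check whether the
    domain model retains elements covered only by that requirement.
    If so, the model offers a cue pointing at the omission."""
    rng = random.Random(seed)
    detected = 0
    reqs = list(trace)
    for _ in range(n_runs):
        omitted = rng.choice(reqs)
        covered_by_rest = set().union(*(trace[r] for r in reqs if r != omitted))
        if trace[omitted] - covered_by_rest:  # uncovered elements remain as cues
            detected += 1
    return detected / n_runs

print(f"Estimated sensitivity to unspecified requirements: {simulate_unspecified(trace):.2f}")

Under-specified requirements could be simulated analogously by omitting only part of a requirement's trace set rather than the whole requirement.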
Disciplines:
Computer science
Author, co-author:
ARORA, Chetan ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
SABETZADEH, Mehrdad ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
BRIAND, Lionel ; University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
External co-authors:
no
Document language:
English
Title:
An Empirical Study on the Potential Usefulness of Domain Models for Completeness Checking of Requirements
Publication date:
July 2019
Journal title:
Empirical Software Engineering
ISSN:
1382-3256
eISSN:
1573-7616
Publisher:
Kluwer Academic Publishers, Netherlands
Volume:
24
Issue:
4
Pagination:
2509–2539
Peer reviewed:
Peer reviewed (verified by ORBi)
Focus Area:
Computational Sciences
European project:
H2020 - 694277 - TUNE - Testing the Untestable: Model Testing of Complex Software-Intensive Systems
FnR project:
FNR11601446 - Reconciling Natural-language Requirements And Model-based Specification For Effective Development Of Critical Infrastructure Systems, 2017 (01/11/2017-31/10/2019) - Chetan Arora