Paper published in a book (Scientific congresses, symposiums and conference proceedings)
Experimental Evaluation of a Tool for Change Impact Prediction in Requirements Models: Design, Results, and Lessons Learned
Göknil, Arda; van Domburg, Roderick; Kurtev, Ivan et al.
2014. In The Fourth International Model-Driven Requirements Engineering (MoDRE) workshop
Peer reviewed
 

Files

Full Text: modre-preprint.pdf, Publisher postprint (343.3 kB)

Details



Keywords :
Requirements management tools; change impact analysis; requirements models
Abstract :
[en] Commercial tools such as IBM Rational RequisitePro and DOORS support semi-automatic change impact analysis for requirements. These tools capture relations between requirements and allow tracing the paths they form. In most of these tools, however, relation types convey nothing about the meaning of a relation except its direction. When a change is introduced to a requirement, the requirements engineer analyzes the impact of the change on related requirements. When the semantic information needed to determine precisely how requirements are related is missing, the requirements engineer generally has to assume worst-case dependencies based only on the available syntactic information. We developed a tool that uses the formal semantics of requirements relations to support change impact analysis and prediction in requirements models. The tool, TRIC (Tool for Requirements Inferencing and Consistency checking), works on models that explicitly represent requirements and the relations among them together with their formal semantics. In this paper we report on an evaluation of how TRIC improves the quality of change impact predictions. A quasi-experiment is systematically designed and executed to empirically validate the impact of TRIC. We conduct the quasi-experiment with 21 master's degree students predicting change impact for five change scenarios in a real software requirements specification. The participants are assigned Microsoft Excel, IBM RequisitePro, or TRIC to perform change impact prediction for the change scenarios. It is hypothesized that using TRIC would positively affect the quality of change impact predictions; two formal hypotheses are developed. As a result of the experiment, we are not able to reject the null hypotheses, and thus we cannot show experimentally the effectiveness of our tool. In the paper we discuss the reasons for this failure to reject the null hypotheses.
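The sketch below is a minimal, hypothetical illustration of the distinction the abstract draws between worst-case and semantics-aware impact prediction; it is not TRIC's actual algorithm, and the relation type names and propagation rules are assumptions made only for this example. It models requirements relations as a directed graph of typed edges and compares propagating a change along every relation (worst-case, syntax-only) with propagating it only along relation types assumed to carry impact.

```python
# Hypothetical toy example (not TRIC's implementation): typed requirements
# relations as (source, relation_type, target) triples, with a naive
# breadth-first impact propagation.
from collections import deque

RELATIONS = [
    ("R1", "refines", "R2"),
    ("R2", "requires", "R3"),
    ("R3", "contains", "R4"),
    ("R2", "conflicts", "R5"),
]

def impacted(changed, relations, follow=None):
    """Return requirements reachable from `changed` via relations whose type
    is in `follow`; follow=None means follow every relation (worst case)."""
    out = {}
    for src, rel, dst in relations:
        out.setdefault(src, []).append((rel, dst))
    seen, queue = set(), deque([changed])
    while queue:
        current = queue.popleft()
        for rel, nxt in out.get(current, []):
            if (follow is None or rel in follow) and nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

# Worst-case prediction: every transitively related requirement is suspected.
print(impacted("R1", RELATIONS))                                   # {'R2', 'R3', 'R4', 'R5'}
# Semantics-aware prediction: follow only relation types assumed to propagate change.
print(impacted("R1", RELATIONS, follow={"refines", "requires"}))   # {'R2', 'R3'}
```

Under these assumptions, the semantics-aware prediction yields a smaller candidate impact set, which is the kind of improvement in prediction quality the quasi-experiment was designed to measure.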
Disciplines :
Computer science
Author, co-author :
Göknil, Arda ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
van Domburg, Roderick
Kurtev, Ivan
van den Berg, Klaas
Wijnhoven, Fons
Language :
English
Title :
Experimental Evaluation of a Tool for Change Impact Prediction in Requirements Models: Design, Results, and Lessons Learned
Publication date :
2014
Event name :
The Fourth International Model-Driven Requirements Engineering (MoDRE) workshop co-located with RE 2014
Event place :
Karlskrona, Sweden
Event date :
25-08-2014
Audience :
International
Main work title :
The Fourth International Model-Driven Requirements Engineering (MoDRE) workshop
Pages :
57-66
Peer reviewed :
Peer reviewed
Available on ORBilu :
since 18 August 2014
