Article (Scientific journals)
Test Generation and Test Prioritization for Simulink Models with Dynamic Behavior
Matinnejad, Reza; NEJATI, Shiva; BRIAND, Lionel et al.
2019, in IEEE Transactions on Software Engineering, 45 (9), pp. 919-944
Peer reviewed
 

Documents


Full text
paper.pdf
Author postprint (2.31 MB)

All documents in ORBilu are protected by a usage licence.


Details



Keywords:
Simulink models; search-based software testing; test generation; test prioritization; test oracle; output diversity; signal features; structural coverage
Abstract:
[en] All engineering disciplines are founded on, and rely on, models, although they may differ in the purposes and usages of modeling. Among the different disciplines, the engineering of Cyber Physical Systems (CPSs) particularly relies on models with dynamic behaviors (i.e., models that exhibit time-varying changes). The Simulink modeling platform greatly appeals to CPS engineers since it captures dynamic behavior models. It further provides seamless support for two indispensable engineering activities: (1) automated verification of abstract system models via model simulation, and (2) automated generation of system implementation via code generation. We identify three main challenges in the verification and testing of Simulink models with dynamic behavior, namely incompatibility, oracle and scalability challenges. We propose a Simulink testing approach that attempts to address these challenges. Specifically, we propose a black-box test generation approach, implemented based on meta-heuristic search, that aims to maximize diversity in test output signals generated by Simulink models. We argue that in the CPS domain test oracles are likely to be manual and therefore the main cost driver of testing. In order to lower the cost of manual test oracles, we propose a test prioritization algorithm to automatically rank test cases generated by our test generation algorithm according to their likelihood to reveal a fault. Engineers can then select, according to their test budget, a subset of the most highly ranked test cases. To demonstrate scalability, we evaluate our testing approach using industrial Simulink models. Our evaluation shows that our test generation and test prioritization approaches outperform baseline techniques that rely on random testing and structural coverage.
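
The full approach is described in the paper itself; as a rough illustration of the output-diversity idea summarized above, the Python sketch below greedily orders a set of test cases so that each newly ranked test has an output signal as far as possible (Euclidean distance) from the outputs already ranked. This is a hypothetical, minimal example, not the authors' implementation (which is based on meta-heuristic search over Simulink simulation outputs and signal features); the names signal_distance and rank_by_output_diversity and the toy signals are assumptions made for illustration only.

import numpy as np

def signal_distance(sig_a, sig_b):
    # Euclidean distance between two equally sampled output signals.
    return float(np.linalg.norm(np.asarray(sig_a, dtype=float) - np.asarray(sig_b, dtype=float)))

def rank_by_output_diversity(test_outputs):
    # Greedily order test cases so that each newly ranked test maximizes the
    # minimum distance between its output signal and the signals already ranked.
    # test_outputs: dict mapping a test-case id to its simulated output signal.
    remaining = dict(test_outputs)
    # Seed with the test whose output is, summed over all tests, farthest from the rest.
    first = max(remaining,
                key=lambda t: sum(signal_distance(remaining[t], o) for o in test_outputs.values()))
    ranked = [first]
    del remaining[first]
    while remaining:
        # Next, take the test whose output is farthest from its closest ranked output.
        best = max(remaining,
                   key=lambda t: min(signal_distance(remaining[t], test_outputs[r]) for r in ranked))
        ranked.append(best)
        del remaining[best]
    return ranked

if __name__ == "__main__":
    # Toy output signals standing in for simulated Simulink outputs.
    outputs = {
        "tc1": [0.0, 0.1, 0.2, 0.3],
        "tc2": [0.0, 0.1, 0.2, 0.35],   # nearly identical to tc1
        "tc3": [1.0, 0.5, -0.5, -1.0],  # very different shape
    }
    print(rank_by_output_diversity(outputs))  # tc3 is ranked ahead of the near-duplicates

Ranking tests this way tends to push behaviorally distinct tests to the top of the list, which is the intuition behind using output diversity both for test generation and for prioritization when oracles are manual and the test budget is limited.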
Disciplines:
Computer science
Author, co-author:
Matinnejad, Reza;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
NEJATI, Shiva ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
BRIAND, Lionel ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)
Bruckmann, Thomas;  Delphi Automotive Systems, Luxembourg
External co-authors:
yes
Document language:
English
Title:
Test Generation and Test Prioritization for Simulink Models with Dynamic Behavior
Publication date:
September 2019
Journal title:
IEEE Transactions on Software Engineering
ISSN:
0098-5589
Publisher:
Institute of Electrical and Electronics Engineers, New York, United States
Volume:
45
Issue:
9
Pagination:
919-944
Peer reviewed:
Peer reviewed
Focus Area:
Computational Sciences
European project:
H2020 - 694277 - TUNE - Testing the Untestable: Model Testing of Complex Software-Intensive Systems
Funding body:
EC - European Commission
Available on ORBilu:
since 25 February 2018

Statistics


Number of views
447 (of which 101 from Unilu)
Number of downloads
1852 (of which 56 from Unilu)

Scopus® citations: 62
Scopus® citations (without self-citations): 55
OpenAlex citations: 81
WoS citations: 58
