Paper published in a book (Scientific congresses, symposiums and conference proceedings)
µBert: Mutation Testing using Pre-Trained Language Models
Degiovanni, Renzo Gaston; Papadakis, Mike
2022 · In Degiovanni, Renzo Gaston; Papadakis, Mike (Eds.), µBert: Mutation Testing using Pre-Trained Language Models
Peer reviewed
 

Files


Full Text
codebert_mutation-5.pdf
Author postprint (764.19 kB)
Details



Keywords :
Mutation Testing; Pre-Trained Language Models; CodeBERT
Abstract :
[en] We introduce µBert, a mutation testing tool that uses a pre-trained language model (CodeBERT) to generate mutants. It works by masking a token of the input expression and using CodeBERT to predict it; mutants are then generated by replacing the masked token with the predicted ones. We evaluate µBert on 40 real faults from Defects4J and show that it detects 27 of the 40 faults, while the baseline (PiTest) detects 26. We also show that µBert can be twice as cost-effective as PiTest when the same number of mutants is analysed. Additionally, we evaluate the impact of µBert's mutants when used by program-assertion inference techniques, and show that they can help produce better specifications. Finally, we discuss the quality and naturalness of some interesting mutants produced by µBert during our experimental evaluation.
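The mask-and-predict loop described in the abstract can be sketched as follows. This is a minimal illustration, not the tool's actual implementation: the `stub_predict` function and all helper names are hypothetical stand-ins; a real setup would query CodeBERT, e.g. through a fill-mask pipeline from the Hugging Face Transformers library.

```python
# Sketch of µBert-style mutant generation (hypothetical helper names).
# A real predictor would be CodeBERT queried via a fill-mask interface;
# the stub below keeps the example self-contained and runnable.

MASK = "<mask>"

def generate_mutants(tokens, predict):
    """Mask each token in turn and ask the predictor for replacements.
    Predictions identical to the original token are skipped, since they
    would not change the program."""
    mutants = []
    for i, original in enumerate(tokens):
        masked = tokens[:i] + [MASK] + tokens[i + 1:]
        for candidate in predict(masked):
            if candidate != original:
                mutants.append(tokens[:i] + [candidate] + tokens[i + 1:])
    return mutants

def stub_predict(masked_tokens):
    # Stand-in for CodeBERT: always proposes these two operator tokens.
    return ["-", "+"]

if __name__ == "__main__":
    expr = ["a", "+", "b"]
    for mutant in generate_mutants(expr, stub_predict):
        print(" ".join(mutant))
```

For the expression `a + b`, the stub yields mutants such as `a - b` (the masked `+` replaced by `-`); predictions that reproduce the original token are discarded, mirroring the paper's requirement that a mutant must differ from the original code.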
Research center :
Interdisciplinary Centre for Security, Reliability and Trust (SnT) > Security Design and Validation Research Group (SerVal)
Disciplines :
Computer science
Author, co-author :
Degiovanni, Renzo Gaston ;  University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT) > SerVal
Papadakis, Mike ;  University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)
External co-authors :
no
Language :
English
Title :
µBert: Mutation Testing using Pre-Trained Language Models
Publication date :
2022
Event name :
15th IEEE International Conference on Software Testing, Verification and Validation Workshops (ICST Workshops 2022)
Event date :
April 4-13, 2022
Audience :
International
Main work title :
µBert: Mutation Testing using Pre-Trained Language Models
Publisher :
IEEE
Pages :
160–169
Peer reviewed :
Peer reviewed
Focus Area :
Computational Sciences
Name of the research project :
INTER/ANR/18/12632675/SATOCROSS
Funders :
FNR - Fonds National de la Recherche [LU]
Available on ORBilu :
since 21 July 2022
