Reference : µBert: Mutation Testing using Pre-Trained Language Models
Scientific congresses, symposiums and conference proceedings : Paper published in a book
Engineering, computing & technology : Computer science
Computational Sciences
http://hdl.handle.net/10993/51744
English
Degiovanni, Renzo Gaston [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SnT) > SerVal]
Papadakis, Mike [University of Luxembourg > Faculty of Science, Technology and Medicine (FSTM) > Department of Computer Science (DCS)]
2022
IEEE
160–169
Yes
International
15th IEEE International Conference on Software Testing, Verification and Validation (ICST) Workshops 2022
April 4-13, 2022
[en] Mutation Testing ; Pre-Trained Language Models ; CodeBERT
[en] We introduce µBert, a mutation testing tool that uses a pre-trained language model (CodeBERT) to generate mutants. It masks a token in the expression given as input and asks CodeBERT to predict it; mutants are then generated by replacing the masked token with the predicted ones. We evaluate µBert on 40 real faults from Defects4J and show that it detects 27 of the 40 faults, while the baseline (PiTest) detects 26. We also show that µBert can be twice as cost-effective as PiTest when the same number of mutants is analysed. Additionally, we evaluate the impact of µBert's mutants when used by program-assertion inference techniques, and show that they can help produce better specifications. Finally, we discuss the quality and naturalness of some interesting mutants produced by µBert during our experimental evaluation.
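The mask-and-predict loop described in the abstract can be sketched as follows. This is an illustrative sketch, not µBert's actual implementation: `toy_predict` is a hypothetical stand-in for CodeBERT's fill-mask head (which would normally return top-k token predictions from the model), stubbed with fixed candidates so the sketch runs offline.

```python
def toy_predict(tokens, masked_index, top_k=3):
    """Hypothetical stand-in for a fill-mask model such as CodeBERT:
    returns candidate replacement tokens for the masked position."""
    candidates = {"+": ["-", "*", "+"], "<": ["<=", ">", "<"]}
    return candidates.get(tokens[masked_index], [])[:top_k]

def generate_mutants(expression):
    """Mask each token in turn and replace it with the model's
    predictions; predictions equal to the original token are
    discarded, since they would not produce a mutant."""
    tokens = expression.split()
    mutants = []
    for i, original in enumerate(tokens):
        for prediction in toy_predict(tokens, i):
            if prediction != original:
                mutants.append(" ".join(tokens[:i] + [prediction] + tokens[i + 1:]))
    return mutants

print(generate_mutants("a + b"))  # ['a - b', 'a * b']
```

In the real tool, the predictor would be a pre-trained masked language model queried once per masked position, so the mutants tend to be "natural" replacements the model considers likely in context.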
Interdisciplinary Centre for Security, Reliability and Trust (SnT) > Security Design and Validation Research Group (SerVal)
Fonds National de la Recherche - FnR
INTER/ANR/18/12632675/SATOCROSS
Researchers ; Professionals ; Students ; General public
10.1109/ICSTW55395.2022.00039
https://doi.org/10.1109/ICSTW55395.2022.00039

File(s) associated to this reference

Fulltext file(s):

codebert_mutation-5.pdf (Author postprint, 746.28 kB, Open access)


All documents in ORBilu are protected by a user license.