Scientific journals : Article
Engineering, computing & technology : Computer science
http://hdl.handle.net/10993/35336
How effective are mutation testing tools? An empirical analysis of Java mutation testing tools with manual analysis and real faults
English
Kintis, Marinos [University of Luxembourg > Interdisciplinary Centre for Security, Reliability and Trust (SNT)]
Papadakis, Mike [University of Luxembourg > Faculty of Science, Technology and Communication (FSTC) > Computer Science and Communications Research Unit (CSC)]
Papadopoulos, Andreas [Athens University of Economics and Business > Department of Informatics]
Valvis, Evangelos [Athens University of Economics and Business > Department of Informatics]
Malevris, Nicos [Athens University of Economics and Business > Department of Informatics]
Le Traon, Yves [University of Luxembourg > Faculty of Science, Technology and Communication (FSTC) > Computer Science and Communications Research Unit (CSC)]
In press
Empirical Software Engineering
Springer Science & Business Media B.V.
Yes (verified by ORBilu)
International
1382-3256
1573-7616
[en] Mutation testing ; Fault detection ; Tool comparison ; Human study ; Real faults
[en] Mutation analysis is a well-studied, fault-based testing technique. It requires testers to design tests based on a set of artificial defects (mutants), which guide the testing activity by measuring the ratio of defects that the candidate tests reveal. Unfortunately, applying mutation testing to real-world programs requires automated tools because of the vast number of defects involved. In such cases, the effectiveness of the method depends strongly on the peculiarities of the employed tools, and their implementation inadequacies can lead to inaccurate results. To deal with this issue, we cross-evaluate four mutation testing tools for Java, namely PIT, muJava, Major, and the research version of PIT, PITRV, with respect to their fault-detection capabilities. We investigate the strengths of the tools based on a) a set of real faults and b) manual analysis of the mutants they introduce. We find that there are large differences between the tools' effectiveness and demonstrate that no tool subsumes the others. We also provide results indicating the application cost of the method. Overall, we find that PITRV achieves the best results; in particular, PITRV outperforms the other tools by finding 6% more faults than the other tools combined.

Fulltext file(s):

Kintis_EMSE_2017.pdf — Publisher postprint, 6.04 MB, Limited access (request a copy)


All documents in ORBilu are protected by a user license.