Abstract:
INTRODUCTION: The pursuit of automating software test case generation, particularly for unit tests, has become increasingly important due to the labor-intensive nature of manual test generation [6]. However, a significant challenge in this domain is the inability of automated approaches to generate relevant inputs, which compromises the efficacy of the resulting tests [6]. In this study, we address the critical issue of enhancing the quality of automated test case generation. We demonstrate that bug reports contain valuable, relevant inputs and show their potential for improving software testing. To harness these inputs effectively, we introduce BRMiner, a novel tool that extracts relevant input values from bug reports. Our approach also modifies EvoSuite, a prominent automated test case generation tool, so that it can incorporate these extracted inputs. Through a systematic evaluation on the Defects4J benchmark, we assess the impact of BRMiner's inputs on test adequacy and effectiveness, focusing on code coverage and bug detection. This study not only establishes the relevance of inputs found in bug reports but also offers a practical way to leverage them for automated test case generation in real-world software projects.

In the realm of automated test case generation, methods such as Dynamic Symbolic Execution (DSE) [2] and Search-Based Software Testing (SBST) [3] have been prevalent. Despite their strengths, these techniques often struggle to generate contextually appropriate and realistic inputs [6]. This study therefore emphasizes the untapped potential of bug reports as a source of such inputs. Bug reports are rich in valid, human-readable inputs and are particularly beneficial for improving test coverage and detecting bugs. BRMiner automates the extraction of relevant test inputs from bug reports, significantly improving the efficiency of test case generation. This is achieved by feeding the extracted inputs into EvoSuite, a leading SBST tool. The study also shows the benefit of adding a feature to EvoSuite that accepts external inputs, particularly those drawn from bug reports, improving its efficacy in conjunction with DSE.

Related research in automated test case generation provides context for our work. Unlike BRMiner, TestMiner [6] extracts literals from existing tests to obtain domain-specific values, and approaches such as K-Config [4] and LeRe [7], which use bug report information for compiler testing, diverge from our approach. PerfLearner [1], which extracts execution commands from bug reports to expose performance bugs, also differs from BRMiner's focus on bug detection.
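To make the extraction step concrete, the following Java sketch illustrates one way literal values could be pulled from bug report text before being seeded into a test generator such as EvoSuite. It is a minimal illustration under assumed behavior: the class name BugReportInputExtractor, the regular expressions, and the example report are hypothetical and do not reproduce BRMiner's actual implementation.

import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

/**
 * Illustrative (not BRMiner's actual) extraction of candidate test inputs
 * from bug report text: quoted strings and numeric literals are collected
 * so they can later be seeded into an automated test generator.
 */
public class BugReportInputExtractor {

    // Double-quoted strings, e.g. "config.xml"
    private static final Pattern QUOTED = Pattern.compile("\"([^\"]{1,100})\"");
    // Integer or floating-point tokens, e.g. 42 or -3.14
    private static final Pattern NUMBER = Pattern.compile("-?\\d+(\\.\\d+)?");

    /** Returns the candidate input values found in one bug report. */
    public static List<String> extract(String bugReportText) {
        List<String> inputs = new ArrayList<>();
        Matcher quoted = QUOTED.matcher(bugReportText);
        while (quoted.find()) {
            inputs.add(quoted.group(1));
        }
        Matcher number = NUMBER.matcher(bugReportText);
        while (number.find()) {
            inputs.add(number.group());
        }
        return inputs;
    }

    public static void main(String[] args) {
        String report = "Calling parse(\"2013-31-02\") with offset -1 throws "
                + "an IllegalArgumentException instead of returning null.";
        // Prints the quoted value plus the raw numeric tokens
        // (including fragments of the quoted date).
        System.out.println(extract(report));
    }
}

In a seeding-based workflow, values collected this way would be handed to the test generator as an additional constant pool; how exactly they are injected into EvoSuite is specific to the modification described in the paper and is not shown here.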