References of "Pastore, Fabrizio 50002817"
Full Text
Peer Reviewed
ZoomIn: Discovering Failures by Detecting Wrong Assertions
Pastore, Fabrizio UL; Mariani, Leonardo

in Proceedings of the 37th International Conference on Software Engineering (ICSE) (2015, May)

Automatic testing, although useful, is still quite ineffective against faults that do not cause crashes or uncaught exceptions. In the majority of cases automatic tests do not include oracles, and only in some cases do they incorporate assertions; these assertions encode the observed behavior instead of the intended behavior, that is, if the application under test produces a wrong result, the synthesized assertions will encode wrong expectations that match the actual behavior of the application. In this paper we present ZoomIn, a technique that extends the fault-revealing capability of test case generation techniques from crash-only faults to faults that require non-trivial oracles to be detected. ZoomIn exploits the knowledge encoded in the manual tests written by developers and the similarity between executions to automatically determine an extremely small set of suspicious assertions that are likely wrong and thus worth manual inspection. Early empirical results show that ZoomIn detected 50% of the analyzed non-crashing faults in the Apache Commons Math library while requiring the inspection of less than 1.5% of the assertions automatically generated by EvoSuite.
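The abstract does not spell out how the similarity between executions is computed, so the following is only a minimal sketch of the general idea, not the paper's algorithm: each assertion's execution is abstracted here as the set of covered methods, and a generated assertion is flagged as suspicious when no developer-written test execution is sufficiently similar to it (Jaccard similarity under an assumed threshold). The class name SuspiciousAssertionRanker, the coverage abstraction, and all example data are hypothetical.

```java
import java.util.*;

// Illustrative sketch (not ZoomIn's actual implementation): rank auto-generated
// assertions by how dissimilar their executions are from the executions of
// developer-written tests. Executions are abstracted as sets of covered methods.
public class SuspiciousAssertionRanker {

    // Jaccard similarity between two coverage sets.
    static double jaccard(Set<String> a, Set<String> b) {
        if (a.isEmpty() && b.isEmpty()) return 1.0;
        Set<String> inter = new HashSet<>(a);
        inter.retainAll(b);
        Set<String> union = new HashSet<>(a);
        union.addAll(b);
        return (double) inter.size() / union.size();
    }

    // Flag generated assertions whose best similarity to any manual-test
    // execution falls below the threshold: their behaviour is not "vouched for"
    // by any trusted test and is therefore worth manual inspection.
    static List<String> flagSuspicious(Map<String, Set<String>> generated,
                                       Collection<Set<String>> manual,
                                       double threshold) {
        List<String> suspicious = new ArrayList<>();
        for (Map.Entry<String, Set<String>> e : generated.entrySet()) {
            double best = 0.0;
            for (Set<String> m : manual) {
                best = Math.max(best, jaccard(e.getValue(), m));
            }
            if (best < threshold) suspicious.add(e.getKey());
        }
        return suspicious;
    }

    public static void main(String[] args) {
        // Invented example data: two generated assertions and one manual test.
        Map<String, Set<String>> generated = new LinkedHashMap<>();
        generated.put("assertEquals(2.0, solver.solve(f))",
                new HashSet<>(Arrays.asList("solve", "bracket", "evaluate")));
        generated.put("assertEquals(NaN, solver.solve(g))",
                new HashSet<>(Arrays.asList("solve", "fallbackPath")));

        Collection<Set<String>> manual = new ArrayList<>();
        manual.add(new HashSet<>(Arrays.asList("solve", "bracket", "evaluate")));

        // Only the assertion exercising the path never covered by manual tests is reported.
        System.out.println(flagSuspicious(generated, manual, 0.5));
    }
}
```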

Full Text
Peer Reviewed
Generating Complex and Faulty Test Data Through Model-Based Mutation Analysis
Di Nardo, Daniel UL; Pastore, Fabrizio UL; Briand, Lionel UL

in Proceedings of the 2015 IEEE Eighth International Conference on Software Testing, Verification and Validation (ICST) (2015, April)

Testing the correct behaviour of data processing systems in the presence of faulty data is extremely expensive. The data structures processed by these systems are often complex, with many data fields and multiple constraints among them. Software engineers, in charge of testing these systems, have to handcraft complex data files or databases, while ensuring compliance with the multiple constraints to prevent the generation of trivially invalid inputs. In addition, assessing test results often means analysing complex output and log data. Though many techniques have been proposed to automatically test systems based on models, little exists in the literature to support the testing of systems where the complexity lies in the data consumed as input or produced as output, with complex constraints between them. In particular, such systems often need to be tested in the presence of faults in the input data, in order to assess the robustness and behaviour of the system in response to such faults. This paper presents an automated test technique that relies upon six generic mutation operators to automatically generate faulty data. The technique receives two inputs: field data and a data model, i.e., a UML class diagram annotated with stereotypes and OCL constraints. The annotated class diagram is used to tailor the behaviour of the generic mutation operators to the fault model that is assumed for the system under test and the environment in which it is deployed. Empirical results obtained with a large data acquisition system in the satellite domain show that our approach can successfully automate the generation of test suites that achieve slightly better instruction coverage than manual testing based on domain expertise.
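The paper drives its six generic mutation operators from a UML class diagram annotated with OCL constraints; the sketch below only illustrates the underlying idea in plain Java, assuming field constraints are expressed directly as comments rather than as OCL, and using three invented operators. The Record fields, their ranges, and the operator names are hypothetical and do not come from the paper's case study.

```java
import java.util.*;

// Illustrative sketch of model-driven data mutation (not the paper's tool):
// a record type whose fields carry simple constraints, and generic operators
// that derive faulty variants by violating exactly one constraint at a time.
public class DataMutator {

    // A telemetry-like record; field names and ranges are invented for this sketch.
    static class Record {
        int packetId;       // constraint: packetId >= 0
        double temperature; // constraint: -50.0 <= temperature <= 150.0
        String unit;        // constraint: unit in {"C", "K"}

        Record(int packetId, double temperature, String unit) {
            this.packetId = packetId;
            this.temperature = temperature;
            this.unit = unit;
        }

        @Override
        public String toString() {
            return String.format("Record(id=%d, temp=%.1f, unit=%s)",
                    packetId, temperature, unit);
        }
    }

    // Generic operator 1: push a numeric field just outside its declared range.
    static Record violateRange(Record r) {
        return new Record(r.packetId, 150.0 + 1.0, r.unit);
    }

    // Generic operator 2: replace an enumerated field with an undeclared value.
    static Record violateEnumeration(Record r) {
        return new Record(r.packetId, r.temperature, "F");
    }

    // Generic operator 3: drop a mandatory field (modelled here as null).
    static Record dropField(Record r) {
        return new Record(r.packetId, r.temperature, null);
    }

    public static void main(String[] args) {
        Record valid = new Record(42, 21.5, "C");
        // Each mutant violates a single constraint, so the system under test is
        // exercised with data that is faulty but otherwise realistic.
        for (Record mutant : Arrays.asList(
                violateRange(valid), violateEnumeration(valid), dropField(valid))) {
            System.out.println(mutant);
        }
    }
}
```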
