Formal models are increasingly being used as input for automated test generation strategies. Software Cost Reduction (SCR), for example, was designed to detect and correct errors during the requirements phase, while also supporting test generation. However, the syntax of SCR and other formalisms is not trivial for non-experts. In this work, we present a strategy for test case generation from natural language requirements that uses SCR as an intermediate, hidden formalism. To minimize textual ambiguity, requirements are written according to a controlled natural language. Syntactically valid requirements are mapped into their semantic representation using case frames, from which SCR specifications are derived. These specifications are then used by the T-VEC tool to generate test cases. Our strategy was evaluated in four different domains: (i) a vending machine (toy example); (ii) a control system for safety injection in a nuclear power plant (publicly available); (iii) one example provided by our industrial partner Embraer; and (iv) the turn indicator system of Mercedes vehicles (publicly available). As a baseline we considered random testing, and, in general, our strategy outperformed it in terms of both performance and mutant-based strength analysis.
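To make the pipeline concrete, the sketch below illustrates the general idea of mapping a controlled-natural-language requirement to a case frame and then to an SCR-style event expression. It is a minimal, hypothetical example: the CNL pattern, the CaseFrame fields, and the to_scr_event_row rendering are assumptions for illustration and do not reproduce the actual grammar, case frames, or tooling used in the strategy.

```python
import re
from dataclasses import dataclass


# Hypothetical case-frame representation; the paper's actual frames are richer.
@dataclass
class CaseFrame:
    agent: str       # system component that reacts
    condition: str   # monitored condition (the "When" clause)
    action: str      # controlled response


# Illustrative CNL pattern of the form:
#   "When <condition>, the <agent> shall <action>."
CNL_PATTERN = re.compile(
    r"^When (?P<condition>.+?), the (?P<agent>.+?) shall (?P<action>.+?)\.$"
)


def parse_requirement(sentence: str) -> CaseFrame:
    """Map a syntactically valid CNL requirement onto its case frame."""
    match = CNL_PATTERN.match(sentence)
    if match is None:
        raise ValueError(f"Sentence does not follow the CNL grammar: {sentence!r}")
    return CaseFrame(**match.groupdict())


def to_scr_event_row(frame: CaseFrame) -> str:
    """Render the case frame as a single SCR-style triggered-event expression."""
    return f"@T({frame.condition}) -> {frame.agent.replace(' ', '_')}.{frame.action.replace(' ', '_')}"


if __name__ == "__main__":
    req = "When the coin slot receives a valid coin, the vending machine shall enable selection."
    frame = parse_requirement(req)
    print(to_scr_event_row(frame))
    # @T(the coin slot receives a valid coin) -> vending_machine.enable_selection
```

In the actual strategy, the resulting SCR specification (rather than a single expression) would then be handed to T-VEC for test vector generation.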