Reducing verification effort in component-based software engineering through built-in testing

Today, component- and service-based technologies play a central role in many aspects of enterprise computing. However, although the technologies used to define, implement, and assemble components have improved significantly in recent years, the techniques for verifying systems built from them have changed very little. The correctness and reliability of component-based systems are still usually checked with the traditional testing techniques that predate the widespread use of components and services, and the associated costs and overheads remain high. This paper presents an approach that addresses this problem by making the system verification process more component-oriented. Based on the notion of built-in tests (BIT), tests that are packaged and distributed with prefabricated, off-the-shelf components, the approach partially automates the testing process and thereby reduces the effort needed to establish the acceptability of a system. The approach consists of a method that defines how components should be written to support and make use of run-time tests, and a resource-aware infrastructure that arranges for tests to be executed when they have minimal impact on the delivery of system services. After introducing the principles behind component-based verification and explaining the main features of the approach and its supporting infrastructure, we show by means of a case study how it can reduce system verification effort.
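
To make the idea concrete, the sketch below illustrates the two ingredients the abstract refers to: components that carry their own tests behind a small built-in-test contract, and a resource-aware runner that executes those tests only when the host is lightly loaded. This is a minimal illustration in Java; all names (BitEnabled, TestResult, AccountService, ResourceAwareTestRunner) and the load threshold are assumptions made for this example, not the interfaces or infrastructure defined in the paper.

```java
// Hypothetical sketch of built-in testing (BIT) with resource-aware execution.
// Names and thresholds are illustrative assumptions, not the paper's actual API.

import java.lang.management.ManagementFactory;
import java.lang.management.OperatingSystemMXBean;
import java.util.List;

/** Contract a BIT-enabled component exposes alongside its functional interface. */
interface BitEnabled {
    /** Runs the tests packaged with the component and reports the outcome. */
    TestResult runBuiltInTests();
}

/** Outcome of one built-in test run. */
record TestResult(String componentName, boolean passed, String detail) { }

/** Example off-the-shelf component that ships with its own acceptance test. */
class AccountService implements BitEnabled {
    int balance = 0;

    void deposit(int amount) {
        if (amount < 0) throw new IllegalArgumentException("negative deposit");
        balance += amount;
    }

    @Override
    public TestResult runBuiltInTests() {
        // A packaged test exercising the component in its deployment context.
        AccountService probe = new AccountService();
        probe.deposit(10);
        boolean ok = probe.balance == 10;
        return new TestResult("AccountService", ok, ok ? "deposit works" : "deposit failed");
    }
}

/** Resource-aware runner: executes built-in tests only when system load is low. */
class ResourceAwareTestRunner {
    private static final double LOAD_THRESHOLD = 0.25; // hypothetical idle threshold

    static void runWhenIdle(List<BitEnabled> components) {
        OperatingSystemMXBean os = ManagementFactory.getOperatingSystemMXBean();
        double load = os.getSystemLoadAverage() / Math.max(1, os.getAvailableProcessors());
        if (load >= 0 && load > LOAD_THRESHOLD) {
            return; // defer testing so the delivery of system services is not disturbed
        }
        for (BitEnabled c : components) {
            System.out.println(c.runBuiltInTests());
        }
    }

    public static void main(String[] args) {
        runWhenIdle(List.of(new AccountService()));
    }
}
```

In an actual deployment one would expect the runner to be part of the component platform and to be triggered by its resource monitor, rather than invoked from a main method as in this illustration.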
