Scenario-Based Validation: Beyond the User Requirements Notation

A quality-driven approach to software development and testing demands that, ultimately, the requirements of stakeholders be validated against the actual behavior of an implementation under test (IUT). In model-based testing, much work has been done on the generation of functional test cases, but few approaches tackle the executability of such test cases, and those that do offer solutions in which test cases are not directly traceable back to the actual behavior and components of an IUT. Furthermore, very few approaches address non-functional requirements. Indeed, the User Requirements Notation (URN) is one of the few proposals that addresses the modeling and validation of both functional and non-functional requirements. But if URN is to support traceability and executability of test cases with respect to an actual IUT, then the “URN puzzle” must be modified: it must be augmented with a testable model for functional and non-functional requirements, an IUT, and explicit bindings between the two. We explain how these three additions are used in our implemented framework to support scenario-based validation.

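To make the idea of explicit bindings concrete, the sketch below (in Java) illustrates one possible shape such a binding could take: a scenario step from a testable requirements model carries a functional postcondition and a non-functional latency budget, and the binding records which IUT operation realizes that step. This is a minimal illustration under assumed names (ScenarioStep, Binding, a toy queue standing in for the IUT); it is not the paper's framework or API.

// Hypothetical sketch: binding one scenario step of a testable
// requirements model to a concrete IUT operation, then checking
// both a functional postcondition and a non-functional latency
// budget. All names are illustrative, not taken from the paper.
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.function.Predicate;

public class ScenarioBindingSketch {

    /** One step of a scenario in the testable requirements model. */
    record ScenarioStep(String name, long latencyBudgetMs) { }

    /** Explicit binding from a model-level step to IUT behavior. */
    record Binding(ScenarioStep step,
                   Runnable driveIut,                   // invokes the real IUT
                   Predicate<Deque<String>> postCheck)  // functional requirement
    {
        /** Drives the bound IUT operation and reports a verdict. */
        boolean validate(Deque<String> iutState) {
            long start = System.nanoTime();
            driveIut.run();                              // exercise the IUT
            long elapsedMs = (System.nanoTime() - start) / 1_000_000;
            boolean functionalOk = postCheck.test(iutState);
            boolean latencyOk = elapsedMs <= step.latencyBudgetMs();
            System.out.printf("%s: functional=%b, latency=%dms (budget %dms)%n",
                    step.name(), functionalOk, elapsedMs, step.latencyBudgetMs());
            return functionalOk && latencyOk;
        }
    }

    public static void main(String[] args) {
        // Toy IUT: a queue standing in for a real component under test.
        Deque<String> queue = new ArrayDeque<>();

        Binding enqueueStep = new Binding(
                new ScenarioStep("enqueue-order", 50),
                () -> queue.addLast("order-42"),
                state -> "order-42".equals(state.peekLast()));

        boolean verdict = enqueueStep.validate(queue);
        System.out.println("Scenario step verdict: " + (verdict ? "PASS" : "FAIL"));
    }
}

Keeping the model-level step separate from the code that drives the IUT is what makes each verdict traceable: a failure points back both to a specific requirement (the step and its budget) and to a specific IUT operation (the bound invocation).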