Verification and validation of a project collaboration tool

Given that significant research and development effort goes into the creation of software tools, it is important that the most effective verification and validation methods be applied. Traditional methods for evaluating the accuracy and benefits of the collaborative business process platforms created by East and Kirby do not, to date, constitute sufficient proof that these tools operate as designed or provide the maximum possible value to all users. Verification through user interviews and surveys demonstrates that the software performs as expected under test conditions, but is insufficient to identify off-purpose uses. Validation efforts performed at discrete points in time, such as economic analyses, describe specific cases and require assumptions of generality. Subjective continuous evaluations, such as user-submitted Call Center tickets, provide a continuous but incomplete measure of users' experience. This paper provides a new taxonomy that can help researchers and developers frame future verification and validation efforts. The four dimensions of this taxonomy are Objectivity, Sample Size, Frequency, and Purpose. Software users can also apply the taxonomy to evaluate the extent to which products have been evaluated beyond the standard case studies typically found in software vendor literature.

[1] Gonzalo Perez, et al. Improved Design Review through Web Collaboration, 2004.

[2] T. Michael Toole, et al. Building Performance Engineering during Construction, 2005.

[3] P. Benson Shing, et al. Validation of a Fast Hybrid Test System with Substructure Tests, 2006.

[4] T. H. Tse, et al. An empirical comparison between direct and indirect test result checking approaches, 2006, SOQUA '06.

[5] M. G. Staskauskas. An experience in the formal verification of industrial software, 1996.

[6] Sam S. Y. Wang, et al. A Systematic Procedure for Flow Model Verification and Validation, 2005.

[7] David Gelperin, et al. The growth of software testing, 1988, CACM.

[8] E. William East, et al. Abstracting Lessons Learned from Design Reviews, 1996.

[9] John C. Cherniavsky, et al. Validation, Verification, and Testing of Computer Software, 1982, CSUR.

[10] Alexander L. Wolf, et al. Software process validation: quantitatively measuring the correspondence of a process to a model, 1999, TSEM.

[11] Peter Fettke, et al. Business Process Modeling Notation, 2008, Wirtschaftsinf.

[12] E. William East and Jeffrey G. Kirby. Standardizing Contractual Information Exchange, 2006.

[13] Spiro N. Pollalis, et al. Promise and Barriers to Technology Enabled and Open Project Team Collaboration, 2005.

[14] D. R. Wallace, et al. Software verification and validation: an overview, 1989, IEEE Software.

[15] Johannes Mayer, et al. Proceedings of the 3rd international workshop on Software quality assurance, 2006, FSE 2006.

[16] Laura N. Lowes, et al. Evaluation, calibration, and verification of a reinforced concrete beam-column joint model, 2007.

[17] Donald K. Hicks, et al. Improvements in design review management, 1988.

[18] P. Pierce. Software verification and validation, 1996, IEEE Technical Applications Conference. Northcon/96. Conference Record.

[19] William Morris. The American Heritage Dictionary of the English Language, 1969.

[20] James D. Arthur, et al. Evaluating the Effectiveness of Independent Verification and Validation, 1999, Computer.

[21] E. W. East, et al. Design Review and Checking System (DrChecks), 2001.

[22] Simon Smith, et al. Construction Research Congress 2005, 2005.

[23] Samuel B. Williams, et al. Association for Computing Machinery, 2000.

[24] Liang Y. Liu, et al. Design Review Checking System with Corporate Lessons Learned, 2003.

[25] Simon L. Peyton Jones, et al. Roadmap for enhanced languages and methods to aid verification, 2006, GPCE '06.

[26] Victor R. Basili, et al. Iterative and incremental developments. A brief history, 2003, Computer.