The case for design science utility and quality: Evaluation of design science artifacts within the Sustainable ICT Capability Maturity Framework

In Design Science Research (DSR), the evaluation of research outputs in the form of design artifacts has been discussed in numerous publications. Many researchers have emphasised the criterion of utility for design artifacts, whereas recent approaches have extended this view to other criteria such as efficiency, consistency, accuracy, performance and reliability, to mention a few. In this paper we revisit the evaluation discussion in design science and describe a practice-oriented evaluation framework. In order to incorporate evaluation along ongoing design cycles, we propose to build on work related to semiotics and information quality. We argue that design science research is usually complex and requires a more detailed evaluation approach. We review the literature and follow a semiotic framework for managing knowledge in complex environments. Together with well-established information quality criteria, we demonstrate that the proposed framework can provide a practical evaluation approach within a complex design environment. The framework was developed within the context of a novel IT management model. We describe its development during the creation of the maturity model for Sustainable Information and Communication Technology (ICT), called the SICT Capability Maturity Framework (SICT-CMF). This context was selected because it is particularly interesting: the design artifacts were created within an open and complex innovation community. The research acknowledges the importance of the utility criterion, but also includes other criteria such as artifact quality, consistency and accuracy, offering a more differentiated view of design science evaluation.
