The role of domain analysis in prediction instrument development

To develop prediction instruments with sufficient predictive power, it is essential to understand the specific domain for which the instrument is developed. This domain analysis is especially important in domains where human behavior, politics, or other soft factors play a role; if these are not well understood, the predictive power of the instrument suffers severely. In this paper, we provide literature-based arguments for the use of domain analysis in the development of prediction instruments, and we discuss the circumstances under which domain analysis is especially important. We also present a structured literature review of the actual adoption of domain analysis for predictive analytics. The review shows that few papers discuss how domain analysis was performed and that, when it is discussed, the type of analysis often does not fit the type of domain. Since these papers nevertheless report adequate predictive power, we believe the domain analysis in these papers was done implicitly. To make the process of prediction instrument development, including domain analysis, more transparent, we present requirements for a prediction instrument development method and an outline of such a method based on those requirements.
