An Information System Agnostic Environment for Reproducible Experiments

Designing reproducible experiments with information systems (IS) is a challenging task, mainly due to the complexity of the processes involved, vendor-specific tools, and the difficulty of finding an adequate abstraction level. While technical challenges (e.g. finding performance bottlenecks) can be investigated in detail using simplified near-life settings or purpose-built research prototypes, these approaches are not feasible in experiments concerned with IS used in complex environments and problem domains (e.g. governance, risk and compliance management), as the software often cannot be simplified without considerable effort or a loss of rigor. On the other hand, software used in such domains is almost never designed by vendors with researchers' goals and needs in mind, which makes its use in experiments burdensome. To address this, we present a research prototype of an experimentation environment that simplifies the creation of reproducible experiments around off-the-shelf information systems.
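A typical building block for such an environment is container-based provisioning, so that every experiment run starts from an identical software stack. The following Python sketch illustrates that idea under stated assumptions; it is not the prototype's actual interface, and the image names, versions, and container names are hypothetical placeholders:

```python
# Hypothetical sketch of container-based experiment provisioning.
# The image names/versions and container names are illustrative only.
import subprocess

# Pin exact versions (or image digests) instead of mutable tags like ":latest".
IMAGES = {
    "is-under-test": "vendor/grc-suite:2.3.1",   # hypothetical off-the-shelf IS
    "experiment-db": "postgres:15.4",            # its backing database
}

def start_container(name: str, image: str) -> None:
    """Start a disposable container; --rm removes it once it stops."""
    subprocess.run(
        ["docker", "run", "--rm", "-d", "--name", name, image],
        check=True,
    )

def provision() -> None:
    """Bring up the pinned software stack before each experiment run."""
    for name, image in IMAGES.items():
        start_container(name, image)
    # The experiment itself (e.g. a scripted task sequence against the IS's
    # web interface) would be driven against these containers here.

if __name__ == "__main__":
    provision()
```

Pinning exact versions (or, stricter still, image digests) is what makes a run reproducible: a mutable tag such as ":latest" can silently change between runs.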
