Spreadsheet Presentation and Error Detection: An Experimental Study

The pervasiveness and impact of electronic spreadsheets have raised serious concerns about their integrity and validity when they are used in significant decision-making settings. Previous studies have shown that few of the errors in a given spreadsheet are found, even when the reviewer is explicitly looking for them. It was hypothesized that differences in the presentation of spreadsheets and their formulas would affect the rate at which such errors are detected. A sample of 113 M.B.A. students volunteered to search for eight errors planted in a one-page spreadsheet, which was presented in five different formats. A 2 × 2 design assigned four groups conventional spreadsheets, crossing presentation medium (paper versus screen) with formula visibility (formulas present or absent). A fifth group received a special printed spreadsheet with formulas visibly integrated into the layout, printed in a small font directly under the resultant values. As in previous studies, only about 50 percent of the errors were found overall. Subjects with printed spreadsheets found more errors than their colleagues with screen-only spreadsheets, but they took longer to do so. There was no discernible formula effect: subjects who could refer to formulas did not outperform subjects with access to only the final numbers, and the special format did not facilitate error finding. Exploratory analysis uncovered some interesting results. The special integrated paper format appeared to reduce the number of correct items falsely identified as errors, and there also seemed to be differences in performance accounted for by subjects' self-reported error-finding strategies. Researchers should focus on other factors that might facilitate error finding, and practitioners should be cautious about relying on spreadsheets' accuracy, even those that have been "audited."