True and false dependence on technology: Evaluation with an expert system

Abstract There is a danger inherent in labeling systems “expert.” Such a label implies some level of “intelligence” or “understanding” within the confines of the system. It is important to know the limitations of any system, including holding realistic expectations of the real or implied power of an expert system. The “blindness” or boundaries inherent in expert system development extend to users, who may misplace their trust in a false technology. This study investigates the use of an expert system that gives incorrect advice. Expert and novice engineers used the faulty system to solve a well test interpretation task. Measures of decision confidence, system success, state anxiety, and task difficulty were taken. Subjects expressed confidence in their “wrong” answer to the problem, displaying true dependence on a false technology. These results suggest implications for developers and users in the areas of certification, evaluation, risk assessment, validation, and verification of systems conveying a level of “expertise.”