The Influence of Task Load and Automation Trust on Deception Detection

The purpose of this research was to investigate the effect that a user's task load has on the relationship between an individual's trust in a system's automation and his or her subsequent use of that automation. Military decision-makers trust and use information system automation to make many tactical judgments and decisions. In situations of information uncertainty (information warfare environments), decision-makers must remain aware of information reliability issues and temper their use of system automation when necessary. An individual's task load may affect his or her use of a system's automation in such environments. It was hypothesized that user task load would moderate the positive relationship between trust in system automation and use of system automation; specifically, in situations of information uncertainty (low trust), high task load would weaken that relationship. To test this hypothesis, an experiment was conducted in a simulated command-and-control micro-world in which trust in system automation and individual task load were manipulated. The findings support the positive relationship between automation trust and automation use found in previous research and suggest that task load does weaken this relationship. Participants who incurred a higher task load exhibited an over-reliance on their automated information systems to assist them in their decision-making activities. Such over-reliance can create vulnerability to deception and suggests the need for automated deception-detection capabilities.
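The moderation hypothesis can be summarized in a standard interaction form. This is an illustrative sketch only: the abstract does not report a specific statistical model, so the variable names and coefficient signs below are assumptions intended to make the hypothesized relationship explicit.

```latex
% Illustrative moderated-regression form of the hypothesis (assumed, not taken from the paper):
%   Use_i   = participant i's reliance on the system's automation
%   Trust_i = participant i's trust in the automation
%   Load_i  = manipulated task load condition (high vs. low)
\mathrm{Use}_i = \beta_0 + \beta_1\,\mathrm{Trust}_i + \beta_2\,\mathrm{Load}_i
               + \beta_3\,(\mathrm{Trust}_i \times \mathrm{Load}_i) + \varepsilon_i,
\qquad \beta_1 > 0,\;\; \beta_3 < 0.
```

Under this reading, a positive \(\beta_1\) reflects the trust-use relationship reported in prior work, while a negative interaction coefficient \(\beta_3\) corresponds to high task load weakening that relationship in low-trust (information-uncertain) conditions.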
