The Influence of Risky Conditions on Trust in Autonomous Systems

To realize the full range of benefits of autonomous systems, it is vital to understand how operators come to trust an automated system. The level of risk in an environment has often been suggested as an important factor affecting trust in automated and autonomous systems, yet it has received little empirical study. This study explores how differing levels of risk affect trust in an autonomous system among individuals who vary in their own risk profiles. In a UAV management task, participants worked with an autonomous teammate to protect an area from incoming enemies; risk was operationalized as the amount of money a participant stood to lose by failing to protect the safe zone. The results partially support the hypothesis that trust decreases as risk increases, although behavioral and subjective measures of trust diverged. Overall, this experiment provides evidence that risk is an important situational factor affecting trust in automation.
