Exploring Trust Barriers to Future Autonomy: A Qualitative Look

Autonomous systems dominate future Department of Defense (DoD) strategic perspectives, yet little is known about the trust barriers to these future systems, as few exemplars exist from which to appropriately baseline reactions. Most extant DoD systems represent “automated” rather than “autonomous” systems, which adds complexity to our understanding of user acceptance of autonomy. The trust literature posits several key trust antecedents for automated systems, with few field applications of these factors in the context of DoD systems. The current paper will: (1) review the trust literature as relevant to acceptance of future autonomy, (2) present the results of a qualitative analysis of trust barriers for two future DoD technologies (the Automatic Air Collision Avoidance System [AACAS] and the Autonomous Wingman [AW]), and (3) discuss knowledge gaps for implementing future autonomous systems within the DoD. The study team interviewed over 160 fighter pilots from 4th Generation (e.g., F-16) and 5th Generation (e.g., F-22) fighter platforms to gauge their trust barriers to AACAS and AW. Results show that the trust barriers discussed by the pilots corresponded fairly well to the trust challenges identified in the literature, though some nuances were revealed that may be unique to DoD technologies/operations. Key trust barriers included: concern about interference with operational requirements; the need for transparency of intent, function, status, and capabilities/limitations; concern regarding the flexibility and adaptability of the technology; cyber security/hacking potential; concern regarding the added workload associated with the technology; concern about the lack of human oversight/decision-making capacity; and doubts regarding the systems’ operational effectiveness.
Additionally, the pilots noted several positive aspects of the proposed technologies, including: added protection during last-ditch evasive maneuvers; positive views of existing fielded technologies such as the Automatic Ground Collision Avoidance System; the potential for added operational capabilities; the potential to transfer risk to the robotic asset and reduce risk to pilots; and the potential for AI to participate in the entire mission process (planning-execution-debriefing). This paper will discuss the results for each technology and offer suggestions for implementing future autonomy within the DoD.
