A System Dynamics Model for Human Trust in Automation under Speed and Accuracy Requirements

Research shows that human trust in automation is a key predictor of human reliance on automation. Several models have been proposed to capture the interplay between trust and reliance and their combined impacts on task performance. Whereas some models assume that trust is affected by automation reliability, others assume that trust is affected by automation speed. However, both speed and reliability can be crucial for mission performance; as a result, these models fail to represent the interrelationships among automation speed, automation reliability, human decision making, and their subsequent effects on mission performance. To address this gap, we propose a system dynamics model that incorporates both the speed and reliability of automation and their combined effects on trust. The model explicitly represents the speed-accuracy compromise that subjects adopt to weigh the perceived relative importance of these two aspects when making the reliance decision. The model is calibrated and evaluated using data collected from a human experiment in which 33 subjects interacted with an automated aid for swarm supervision in a foraging mission. The simulation results show that the model can closely replicate and predict the experimental data in terms of the reliance rate and the number of targets collected. Model limitations and directions for model extension are discussed.
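To make the stock-and-flow idea behind such a system dynamics model concrete, the following is a minimal sketch in Python. It is an illustrative assumption, not the paper's calibrated model: trust is treated as a single stock that adjusts toward a perceived-performance signal blending automation reliability and speed via a hypothetical tradeoff weight, and reliance is a logistic function of trust. All parameter names and values (`w_accuracy`, `alpha`, the logistic gain) are invented for illustration.

```python
import math

def simulate_trust(reliability, speed_score, w_accuracy=0.6,
                   alpha=0.3, trust0=0.5, steps=50, dt=1.0):
    """Euler-integrate a single trust stock (illustrative sketch).

    Hypothetical parameters (not from the paper):
      reliability, speed_score -- perceived automation qualities in [0, 1]
      w_accuracy -- weight of accuracy in the speed-accuracy compromise
      alpha      -- trust adjustment rate per time step
    Returns a list of (trust, reliance_probability) samples.
    """
    trust = trust0
    history = []
    for _ in range(steps):
        # Perceived performance blends reliability and speed per the tradeoff weight.
        perceived = w_accuracy * reliability + (1 - w_accuracy) * speed_score
        # Flow: trust adjusts toward perceived performance (first-order dynamics).
        trust += alpha * (perceived - trust) * dt
        # Reliance modeled as a logistic function of the trust stock.
        reliance = 1.0 / (1.0 + math.exp(-10.0 * (trust - 0.5)))
        history.append((trust, reliance))
    return history
```

Under this sketch, a highly reliable and fast automation (e.g. `simulate_trust(0.9, 0.8)`) drives trust toward the blended perceived performance and reliance toward 1, mirroring the qualitative behavior the abstract describes.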
