Introducing SMRTT: A Structural Equation Model of Multimodal Real-Time Trust

Advances in autonomous technology have led to increased interest in human-autonomy interaction. Generally, the success of these interactions is measured by the joint performance of the AI and the human operator. That performance depends, in part, on the operator holding appropriate, or calibrated, trust in the autonomy. Optimizing the performance of human-autonomy teams therefore relies partly on modeling and measuring human trust. Theories and models of the factors that influence human trust have been developed to support such measurement; however, these models often rely on self-report rather than more objective, real-time behavioral and physiological data. This paper builds on existing theoretical frameworks of trust by incorporating objective data to create a model capable of finer-grained temporal measures of trust. Presented herein is SMRTT: SEM of Multimodal Real-Time Trust. SMRTT leverages structural equation modeling (SEM) techniques to arrive at a real-time model of trust. Variables and factors drawn from previous studies and existing theories form the components of SMRTT. The value of adding physiological data to the model to enable real-time monitoring is discussed, along with future plans to validate the model.
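To make the modeling approach more concrete, the sketch below shows how a latent trust factor with multimodal indicators could be specified in lavaan-style SEM syntax using the Python package semopy. It is a minimal illustration only: the indicator names (gsr, heart_rate, gaze_dwell, self_report, reliance_rate) and the synthetic data are assumptions for the example, not the variables, structure, or results of SMRTT.

```python
# Hypothetical SEM sketch: a latent "trust" factor measured by physiological,
# behavioral, and self-report indicators, with reliance behavior regressed on
# trust. Indicator names and data are illustrative, not taken from the paper.
import numpy as np
import pandas as pd
from semopy import Model

MODEL_DESC = """
trust =~ gsr + heart_rate + gaze_dwell + self_report
reliance_rate ~ trust
"""

# Synthetic placeholder data purely to make the example runnable.
rng = np.random.default_rng(0)
n = 200
latent = rng.normal(size=n)
data = pd.DataFrame({
    "gsr": latent + rng.normal(scale=0.5, size=n),
    "heart_rate": latent + rng.normal(scale=0.5, size=n),
    "gaze_dwell": latent + rng.normal(scale=0.5, size=n),
    "self_report": latent + rng.normal(scale=0.5, size=n),
    "reliance_rate": 0.8 * latent + rng.normal(scale=0.5, size=n),
})

model = Model(MODEL_DESC)
model.fit(data)          # estimate factor loadings and the path coefficient
print(model.inspect())   # parameter estimates with standard errors
```

In a real-time setting, a model of this form would be re-evaluated over a moving window of sensor data rather than a single static sample, which is the kind of extension the abstract motivates.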
