The More You Know: Trust Dynamics and Calibration in Highly Automated Driving and the Effects of Take-Overs, System Malfunction, and System Transparency

Objective: This paper presents a theoretical model and two simulator studies on the psychological processes during early trust calibration in automated vehicles.
Background: The positive outcomes of automation can only reach their full potential if a calibrated level of trust is achieved. In this process, information on system capabilities and limitations plays a crucial role.
Method: In two simulator experiments, trust was measured repeatedly during an automated drive. In Study 1, all participants in a two-group experiment experienced a system-initiated take-over, and the occurrence of a system malfunction was manipulated. In Study 2, which used a 2 × 2 between-subjects design, system transparency was manipulated as an additional factor.
Results: Trust was found to increase progressively over the first interactions. In Study 1, take-overs led to a temporary decrease in trust, as did malfunctions in both studies. Notably, trust was reestablished in the course of further interaction after both take-overs and malfunctions. In Study 2, the high-transparency condition showed no temporary decline in trust after a malfunction.
Conclusion: Trust is calibrated on the basis of information provided prior to and during the initial drive with an automated vehicle. The experience of take-overs and malfunctions leads to a temporary decline in trust that is recovered in the course of error-free interaction. This temporary decrease can be prevented by providing transparent information prior to system interaction.
Application: Transparency, including information about potential system limitations, plays an important role in this process and should be considered in the design of tutorials and human-machine interaction (HMI) concepts for automated vehicles.
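The design described in the Method section (repeated trust measurements as a within-subject factor, crossed with between-subject manipulations) lends itself to a mixed ANOVA. The following minimal sketch illustrates that analysis structure for a two-group design like Study 1; the data, sample size, and effect sizes are hypothetical, and the pingouin-based code is an illustration of the analysis approach, not the authors' analysis code.

```python
# Minimal sketch (hypothetical data, not the authors' analysis code):
# simulate repeated trust ratings for a two-group design like Study 1
# and test the group x measurement interaction with a mixed ANOVA.
# Requires numpy, pandas, and pingouin.
import numpy as np
import pandas as pd
import pingouin as pg

rng = np.random.default_rng(42)

n_per_group = 20          # hypothetical sample size per group
n_measurements = 5        # repeated trust ratings during the drive
groups = ["no_malfunction", "malfunction"]

rows = []
for group in groups:
    for p in range(n_per_group):
        participant = f"{group}_{p}"
        for m in range(n_measurements):
            # Trust increases over repeated measurements (hypothetical slope);
            # the malfunction group shows a temporary dip at one measurement.
            trust = 3.0 + 0.3 * m + rng.normal(0, 0.4)
            if group == "malfunction" and m == 3:
                trust -= 0.8
            rows.append({"participant": participant, "group": group,
                         "measurement": m, "trust": trust})

df = pd.DataFrame(rows)

# Mixed ANOVA: 'group' as between-subject factor,
# 'measurement' as within-subject (repeated) factor.
aov = pg.mixed_anova(data=df, dv="trust", within="measurement",
                     between="group", subject="participant")
print(aov.round(3))
```

A significant group x measurement interaction in such an analysis would correspond to the pattern reported in the Results: a temporary trust decline after a malfunction in one group, followed by recovery over subsequent error-free measurements.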
