Developing human-machine trust: Impacts of prior instruction and automation failure on driver trust in partially automated vehicles

Abstract: To encourage appropriate and safe use of driving automation, system designers require knowledge about the dynamics of driver trust. To enhance this knowledge, this study manipulated the prior information given about a partial driving automation system at two levels of detail (a detailed group and a less-detailed group) and investigated the effects of this information on the development of trust with respect to the three trust attributions proposed by Muir (1994): predictability, dependability, and faith. A driving simulator produced two types of automation failure (limitation and malfunction), and 56 drivers completed questionnaires about their level of trust in the automation at six points during the study. Statistical analysis found that trust ratings of the automation increased steadily with simulated driving experience, regardless of the drivers’ level of knowledge. Automation failure led to a temporary decrease in trust ratings; however, trust was rebuilt by subsequent experience of flawless automation. Dependability was the most dominant attribution underlying drivers’ trust throughout the experiment, regardless of knowledge level. Interestingly, detailed analysis indicated that trust can be accounted for by different attributions depending on the drivers’ circumstances: after exposure to automation failure, subsequent experience of error-free automation made predictability the secondary predictive attribution of trust in the detailed group, whereas faith was consistently the secondary contributor to trust in the less-detailed group throughout the experiment. These findings have implications for system design regarding transparency, and for training methods and instruction aimed at improving driving safety in traffic environments with automated vehicles.
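
The abstract does not report the analysis procedure, so the following is only a minimal sketch of how a "dominant attribution" could be identified at a single measurement point, assuming each questionnaire yields ratings of predictability, dependability, faith, and overall trust: standardized regression coefficients make the relative contribution of each attribution comparable. All data, scales, and variable names below are hypothetical placeholders, not the authors' materials.

```python
import numpy as np

rng = np.random.default_rng(0)
n_drivers = 56  # sample size reported in the abstract

# Hypothetical 0-10 questionnaire ratings at one measurement point
predictability = rng.uniform(0, 10, n_drivers)
dependability = rng.uniform(0, 10, n_drivers)
faith = rng.uniform(0, 10, n_drivers)
# Hypothetical overall trust, constructed so that dependability dominates
overall_trust = (0.2 * predictability + 0.6 * dependability
                 + 0.2 * faith + rng.normal(0, 1.0, n_drivers))

def standardise(x):
    """Z-score a rating vector so coefficients are directly comparable."""
    return (x - x.mean()) / x.std()

X = np.column_stack([standardise(predictability),
                     standardise(dependability),
                     standardise(faith)])
y = standardise(overall_trust)

# Ordinary least squares: beta = argmin ||y - X @ beta||^2
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
for name, b in zip(["predictability", "dependability", "faith"], beta):
    print(f"{name:>14}: standardized beta = {b:+.2f}")
```

Repeating such a fit at each of the six questionnaire points, separately for the two information groups, is one straightforward way to trace which attribution carries drivers' trust as experience and failures accumulate.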

[1] Shelby K. Long et al. Revisiting human-machine trust: a replication study of Muir and Moray (1996) using a simulated pasteurizer plant task, 2021, Ergonomics.

[2] Bonnie M. Muir et al. Trust Between Humans and Machines, and the Design of Decision Aids, 1987, Int. J. Man Mach. Stud.

[3] Johannes Kraus et al. Psychological processes in the formation and calibration of trust in automation, 2020.

[4] Linda G. Pierce et al. The Perceived Utility of Human and Automated Aids in a Visual Detection Task, 2002, Hum. Factors.

[5] J. Kraus et al. What’s Driving Me? Exploration and Validation of a Hierarchical Personality Model for Trust in Automated Driving, 2020, Hum. Factors.

[6] Jessie Y. C. Chen et al. A Meta-Analysis of Factors Affecting Trust in Human-Robot Interaction, 2011, Hum. Factors.

[7] Douglas A. Wiegmann et al. Effects of Attribute and Goal Framing on Automation Reliance and Compliance, 2005.

[8] Masooda Bashir et al. Trust in Automation, 2015, Hum. Factors.

[9] J. G. Holmes et al. Trust in close relationships, 1985.

[10] Jacob Haspiel et al. Look Who's Talking Now: Implications of AV's Explanations on Driver's Trust, AV Preference, Anxiety and Mental Workload, 2019, Transportation Research Part C: Emerging Technologies.

[11] Klaus Bengler et al. Why Do I Have to Drive Now? Post Hoc Explanations of Takeover Requests, 2018, Hum. Factors.

[12] Josef F. Krems et al. Keep Your Scanners Peeled, 2016, Hum. Factors.

[13] S. Lewandowsky et al. The dynamics of trust: comparing humans to automation, 2000, Journal of Experimental Psychology: Applied.

[14] Brian Smith et al. Development and validation of a questionnaire to assess pedestrian receptivity toward fully autonomous vehicles, 2017.

[15] Yong Gu Ji et al. Investigating the Importance of Trust on Adopting an Autonomous Vehicle, 2015, Int. J. Hum. Comput. Interact.

[16] Holly A. H. Handley et al. Trust and the Compliance–Reliance Paradigm: The Effects of Risk, Error Bias, and Reliability on Trust and Dependence, 2017, Hum. Factors.

[17] N. Moray et al. Trust, control strategies and allocation of function in human-machine systems, 1992, Ergonomics.

[18] William Payre et al. Fully Automated Driving, 2016, Hum. Factors.

[19] D. Wiegmann et al. Similarities and differences between human–human and human–automation trust: an integrative review, 2007.

[20] Birsen Donmez et al. Driver Takeover Performance and Monitoring Behavior with Driving Automation at System-Limit versus System-Malfunction Failures, 2020, Transportation Research Record: Journal of the Transportation Research Board.

[21] Raja Parasuraman et al. Humans and Automation: Use, Misuse, Disuse, Abuse, 1997, Hum. Factors.

[22] Toshiyuki Inagaki et al. Laboratory studies of trust between humans and machines in automated systems, 1999.

[23] Linda Ng Boyle et al. Towards a Personalized Trust Model for Highly Automated Driving, 2016, Mensch & Computer Workshopband.

[24] J. Rouder et al. Default Bayes Factors for Model Selection in Regression, 2012, Multivariate Behavioral Research.

[25] Thomas B. Sheridan et al. Man-machine systems: Information, control, and decision models of human performance, 1974.

[26] Shelby K. Long et al. Human–Automation Trust to Technologies for Naïve Users Amidst and Following the COVID-19 Pandemic, 2020, Hum. Factors.

[27] Linda Ng Boyle et al. Extending the Technology Acceptance Model to assess automation, 2011, Cognition, Technology & Work.

[28] William Payre et al. Intention to use a fully automated car: attitudes and a priori acceptability, 2014.

[29] Guy H. Walker et al. Trust in vehicle technology, 2016.

[30] W. T. Singleton et al. Man-machine systems, 1974.

[31] Catherine M. Burns et al. Autonomous Driving in the Real World: Experiences with Tesla Autopilot and Summon, 2016, AutomotiveUI.

[32] J. Krems et al. The first impression counts – A combined driving simulator and test track study on the development of trust and acceptance of highly automated driving, 2019, Transportation Research Part F: Traffic Psychology and Behaviour.

[33] Bobbie D. Seppelt et al. Keeping the driver in the loop: Dynamic feedback to support appropriate use of imperfect vehicle control automation, 2019, Int. J. Hum. Comput. Stud.

[34] Sarah Sharples et al. Understanding Is Key: An Analysis of Factors Pertaining to Trust in a Real-World Automation System, 2018, Hum. Factors.

[35] Bobbie D. Seppelt et al. Making adaptive cruise control (ACC) limits visible, 2007, Int. J. Hum. Comput. Stud.

[36] John D. Lee et al. Trust in Automation: Designing for Appropriate Reliance, 2004, Hum. Factors.

[37] John D. Lee et al. Exploring Trust in Self-Driving Vehicles Through Text Analysis, 2020, Hum. Factors.

[38] Christopher D. Wickens et al. The benefits of imperfect diagnostic automation: a synthesis of the literature, 2007.

[39] Josef F. Krems et al. Prior Familiarization With Takeover Requests Affects Drivers’ Takeover Performance and Automation Trust, 2017, Hum. Factors.

[40] J. Krems et al. The evolution of mental model, trust and acceptance of adaptive cruise control in relation to initial information, 2013.

[41] Mark Vollrath et al. Improving the Driver–Automation Interaction, 2013, Hum. Factors.

[42] Raja Parasuraman et al. Complacency and Bias in Human Use of Automation: An Attentional Integration, 2010, Hum. Factors.

[43] Thomas B. Sheridan et al. Trustworthiness of Command and Control Systems, 1988.

[44] James L. Szalma et al. A Meta-Analysis of Factors Influencing the Development of Trust in Automation, 2016, Hum. Factors.

[45] Eva-Maria Messner et al. Scared to Trust? – Predicting Trust in Highly Automated Driving by Depressiveness, Negative Self-Evaluations and State Anxiety, 2020, Frontiers in Psychology.

[46] Knowledge of and trust in advanced driver assistance systems, 2021, Accident Analysis & Prevention.

[47] Bryan Reimer et al. Autonomous Vehicles and Alternatives to Driving: Trust, Preferences, and Effects of Age, 2017.

[48] Daniel R. Ilgen et al. Not All Trust Is Created Equal: Dispositional and History-Based Trust in Human-Automation Interactions, 2008, Hum. Factors.

[49] Bako Rajaonah et al. The role of intervening variables in driver-ACC cooperation, 2008, Int. J. Hum. Comput. Stud.

[50] Bonnie M. Muir et al. Trust in automation. Part I: Theoretical issues in the study of trust and human intervention in automated systems, 1994.

[51] Johannes Kraus et al. The More You Know: Trust Dynamics and Calibration in Highly Automated Driving and the Effects of Take-Overs, System Malfunction, and System Transparency, 2019, Hum. Factors.

[52] Natasha Merat et al. Coming back into the loop: Drivers' perceptual-motor performance in critical events after automated driving, 2017, Accident Analysis & Prevention.

[53] N. Moray et al. Trust in automation. Part II: Experimental studies of trust and human intervention in a process control simulation, 1996, Ergonomics.

[54] Joel Johansson et al. Automation Expectation Mismatch: Incorrect Prediction Despite Eyes on Threat and Hands on Wheel, 2018, Hum. Factors.

[55] Joshua E. Domeyer et al. Characterizing Driver Trust in Vehicle Control Algorithm Parameters, 2018, Proceedings of the Human Factors and Ergonomics Society Annual Meeting.

[56] Joel M. Cooper et al. Cognitive Underpinnings of Beliefs and Confidence in Beliefs about Fully Automated Vehicles, 2018.

[57] M. Lee et al. Statistical Evidence in Experimental Psychology, 2011, Perspectives on Psychological Science.

[58] D. Woods et al. Automation Surprises, 2001.