The Design and Integration of a Comprehensive Measurement System to Assess Trust in Automated Driving

With the increased availability of commercially available automated vehicles, trust in automation may play a critical role in overall system safety, rate of adoption, and user satisfaction. We developed and integrated a novel measurement system to support better calibration of driver trust in vehicle automation. The system was designed to collect a comprehensive set of measures based on a validated model of trust covering three types: dispositional, learned, and situational. Our system was integrated into a Tesla Model X to assess different automated functions (e.g., lane changes, parking, and turns) and their effects on trust and performance in real-world driving. The measurement system collects behavioral, physiological (eye and head movements), and self-report measures of trust using validated instruments. A vehicle telemetry system (Ergoneers Vehicle Testing Kit) uses a suite of sensors to capture real-world driving performance data. This off-the-shelf solution is coupled with a custom mobile application for recording driver behaviors, such as engaging or disengaging automation, during on-road driving. Our initial usability evaluations of the system's components showed that the system is easy to use and that events can be logged quickly and accurately. Our system is thus viable for data collection and can be used to model user trust behaviors in realistic on-road conditions.
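To illustrate the event-logging role of the mobile application described above, the following is a minimal, hypothetical sketch (not the authors' actual implementation): each tap records a timestamped driver event, such as engaging or disengaging automation, so the log can later be aligned with the telemetry stream by timestamp. All names here (`DriveEvent`, `EventLogger`) are illustrative assumptions.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import List
import json

@dataclass
class DriveEvent:
    event_type: str   # e.g., "autopilot_engaged", "lane_change"
    timestamp: str    # ISO-8601 UTC, for alignment with telemetry
    note: str = ""    # optional free-text annotation by the experimenter

class EventLogger:
    """Hypothetical in-app log of driver events during an on-road drive."""

    def __init__(self) -> None:
        self.events: List[DriveEvent] = []

    def log(self, event_type: str, note: str = "") -> DriveEvent:
        # Stamp the event at the moment it is logged so it can be
        # synchronized with the vehicle telemetry recording.
        event = DriveEvent(
            event_type=event_type,
            timestamp=datetime.now(timezone.utc).isoformat(),
            note=note,
        )
        self.events.append(event)
        return event

    def to_json(self) -> str:
        # Serialize for export alongside the telemetry data files.
        return json.dumps([asdict(e) for e in self.events], indent=2)

logger = EventLogger()
logger.log("autopilot_engaged")
logger.log("lane_change", note="automated lane change, left")
logger.log("autopilot_disengaged", note="driver takeover")
```

A timestamp-keyed log like this is one simple way the behavioral record could be merged with sensor data in post-processing; the real application may of course use a different schema.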
