Trust in an Autonomously Driven Simulator and Vehicle Performing Maneuvers at a T-Junction With and Without Other Vehicles

Autonomous vehicle (AV) technology is developing rapidly. Level 3 automation assumes the user may need to respond to requests to retake control, whereas Levels 4 (high automation) and 5 (full automation) do not require human monitoring of the driving task or systems [1]: the AV handles the driving functions and makes decisions based on continuously updated information. The gradual shift in the human's role within the vehicle, from active controller to passive passenger, brings uncertainty around trust, which is likely to be a key barrier to acceptability, adoption and continued use [2]. Few studies have investigated trust in AVs, and these have tended to use driving simulators with Level 3 automation [3, 4]. The current study used both a driving simulator and an autonomous road vehicle. Both operated at Level 3 autonomy, although neither required intervention from the user, much like Level 4 systems. Forty-six participants completed UK-based road circuits with both platforms. Trust was measured immediately after turns of increasing complexity at a priority T-junction: for example, turning left or right out of the junction, turning right into the junction, and turning with oncoming/crossing vehicles present. Trust was high across platforms: higher in the simulator for some events and higher in the road AV for others. Generally, and often irrespective of platform, trust was higher for turns involving one or more oncoming/crossing vehicles than for turns without traffic, possibly because the turn felt more controlled: the simulator and road AV always yielded, resulting in a delayed maneuver. We also found multiple positive relationships between trust in automation and technology and the trust ratings for most T-junction turn events across platforms. The assessment of trust was successful, and the novel findings are important to those designing, developing and testing AVs with users in mind. Undertaking a trial of this scale is complex, and caution should be exercised against over-generalizing the findings.
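
As an illustration of how such per-event trust ratings might be analyzed, the sketch below shows a within-subjects comparison of the two platforms for one turn event and a correlation with a dispositional trust-in-technology score. This is a hypothetical sketch, not the authors' analysis pipeline: all variable names and data are placeholders, and only the sample size (46) is taken from the abstract.

    import numpy as np
    from scipy import stats

    # Hypothetical placeholder data: per-participant trust ratings (e.g., on a
    # 1-7 scale) for one T-junction turn event on each platform, plus a
    # dispositional trust-in-technology score. None of this is study data.
    rng = np.random.default_rng(0)
    n_participants = 46  # sample size reported in the abstract
    trust_simulator = rng.integers(4, 8, n_participants).astype(float)
    trust_road_av = rng.integers(4, 8, n_participants).astype(float)
    dispositional_trust = rng.normal(5.0, 1.0, n_participants)

    # Within-subjects comparison of the two platforms for this event
    # (Wilcoxon signed-rank test, suitable for paired ordinal ratings).
    w_stat, w_p = stats.wilcoxon(trust_simulator, trust_road_av)
    print(f"Simulator vs road AV: W = {w_stat:.1f}, p = {w_p:.3f}")

    # Association between dispositional trust and event-level trust ratings
    # (Spearman rank correlation, robust to non-normal rating distributions).
    for label, ratings in (("simulator", trust_simulator), ("road AV", trust_road_av)):
        rho, p = stats.spearmanr(dispositional_trust, ratings)
        print(f"Dispositional trust vs {label} rating: rho = {rho:.2f}, p = {p:.3f}")

In practice the choice of test would depend on the design reported in the full paper (number of events, rating scale, and any corrections for multiple comparisons); the sketch only illustrates the general paired-comparison and correlation approach.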

[1] Klaus Bengler, et al. Trust in Automation – Before and After the Experience of Take-over Scenarios in a Highly Automated Vehicle, 2015.

[2] Graham Parkhurst, et al. Manual Takeover and Handover of a Simulated Fully Autonomous Vehicle Within Urban and Extra-Urban Settings, 2017.

[3] Detmar W. Straub, et al. Trust and TAM in Online Shopping: An Integrated Model, 2003, MIS Quarterly.

[4] M. Endsley. Autonomous Driving Systems: A Preliminary Naturalistic Study of the Tesla Model S, 2017.

[5] C. Wickens, et al. Situation Awareness, Mental Workload, and Trust in Automation: Viable, Empirically Supported Cognitive Engineering Constructs, 2008.

[6] Mica R. Endsley, et al. Toward a Theory of Situation Awareness in Dynamic Systems, 1995, Human Factors.

[7] Mica R. Endsley, et al. From Here to Autonomy, 2017, Human Factors.

[8] Mark W. Scerbo, et al. Automation-induced complacency for monitoring highly reliable systems: the role of task complexity, system experience, and operator trust, 2007.

[9] David C. Nagel, et al. Pilots of the future: human or computer?, 1985, Communications of the ACM.

[10] Christopher D. Wickens, et al. A model for types and levels of human interaction with automation, 2000, IEEE Transactions on Systems, Man, and Cybernetics, Part A.

[11] Peter A. Hancock, et al. Human factors and safety in the design of intelligent vehicle-highway systems (IVHS), 1992.

[12] M. Endsley. Situation Awareness Misconceptions and Misunderstandings, 2015.

[13] Francis T. Durso, et al. Individual Differences in the Calibration of Trust in Automation, 2015, Human Factors.

[14] A. Fisk, et al. Reliability and Age-Related Effects on Trust and Reliance of a Decision Support Aid, 2004.

[15] Jessie Y. C. Chen, et al. A Meta-Analysis of Factors Affecting Trust in Human-Robot Interaction, 2011, Human Factors.

[16] Jason Bennett Thatcher, et al. Trust in a specific technology: An investigation of its components and measures, 2011, ACM Transactions on Management Information Systems.

[17] Renwick E. Curry, et al. Flight-deck automation: promises and problems, 1980.

[18] Raja Parasuraman, et al. Humans and Automation: Use, Misuse, Disuse, Abuse, 1997, Human Factors.

[19] John D. Lee, et al. Trust, self-confidence, and operators' adaptation to automation, 1994, International Journal of Human-Computer Studies.

[20] John D. Lee, et al. Trust in Automation: Designing for Appropriate Reliance, 2004.

[21] Mica R. Endsley, et al. Overcoming Representational Errors in Complex Environments, 2000, Human Factors.

[22] N. Moray, et al. Trust in automation. Part II. Experimental studies of trust and human intervention in a process control simulation, 1996, Ergonomics.

[23] Colin G. Drury, et al. Foundations for an Empirically Determined Scale of Trust in Automated Systems, 2000.

[24] Masooda Bashir, et al. Trust in Automation, 2015, Human Factors.

[25] Jacob Cohen, et al. A power primer, 1992, Psychological Bulletin.

[26] Danko Nikolić, et al. Situation Awareness as a Predictor of Performance in En Route Air Traffic Controllers, 1998.

[27] Charles T. Scialfa, et al. Age differences in trust and reliance of a medication management system, 2005, Interacting with Computers.

[28] Mica R. Endsley, et al. Final Reflections, 2015.

[29] Moritz Körber, et al. Introduction matters: Manipulating trust in automation and reliance in automated driving, 2018, Applied Ergonomics.

[30] Yong Gu Ji, et al. Investigating the Importance of Trust on Adopting an Autonomous Vehicle, 2015, International Journal of Human-Computer Interaction.

[31] Satish Chandra, et al. Critical gap through clearing behavior of drivers at unsignalised intersections, 2011.

[32] Linda Ng Boyle, et al. Extending the Technology Acceptance Model to assess automation, 2011, Cognition, Technology & Work.

[33] Makoto Itoh, et al. Driver's Trust in Automated Driving when Passing Other Traffic Objects, 2015, IEEE International Conference on Systems, Man, and Cybernetics.

[34] L. Bainbridge. Ironies of Automation, 1982.