Turmoil Behind the Automated Wheel - An Embodied Perspective on Current HMI Developments in Partially Automated Vehicles

Cars that combine automated functions such as Adaptive Cruise Control (ACC) and Lane Keeping (LK) are increasingly available to consumers, and higher levels of automation are under development. With these systems, the role of the driver is changing, and the new interaction between driver and vehicle may lead to several human factors problems if it is not sufficiently supported. These issues include driver distraction, loss of situational awareness and high workload during mode transitions. A large conceptual gap remains on how to create safe, efficient and fluent interaction between car and driver, both during automated driving and during mode transitions. This study examines different HMIs from a new perspective: Embodied Interaction. The results identify design spaces that are currently underutilized and that may contribute to safe and fluent driver support systems in partially automated cars.
