Comparing spatially static and dynamic vibrotactile take-over requests in the driver seat.

Vibrotactile stimuli can be effective as warning signals, but their effectiveness as directional take-over requests in automated driving is as yet unknown. This study investigated the correct response rate, reaction times, and eye and head orientation for static versus dynamic directional take-over requests presented via vibrating motors in the driver seat. In a driving simulator, eighteen participants completed three sessions: 1) a session involving no driving (Baseline), 2) driving a highly automated car without an additional task (HAD), and 3) driving a highly automated car while performing a mentally demanding task (N-Back). In each session, participants received four static directional take-over requests (vibration in the left or right part of the seat) and four dynamic ones (vibration moving from one side of the seat towards the opposite side), delivered via two 6×4 motor matrices embedded in the seat back and bottom. In the Baseline session, participants reported whether the cue was left or right; in the HAD and N-Back sessions, participants had to change lanes to the left or to the right according to the directional cue. The correct response rate was operationalized as the accuracy of the self-reported direction (Baseline session) and the accuracy of the lane change direction (HAD and N-Back sessions). The correct response rate ranged from 94% for static patterns in the Baseline session to 74% for dynamic patterns in the N-Back session, although these differences were not statistically significant. Steering-wheel touch and steering input reaction times were approximately 200 ms faster for static patterns than for dynamic ones. Eye tracking revealed a correspondence between head/eye-gaze direction and lane change direction, and showed that head and eye-gaze movements were initiated faster for static vibrations than for dynamic ones.
In conclusion, vibrotactile stimuli presented via the driver seat are effective as warnings, but their effectiveness as directional take-over requests may be limited. The present study may encourage further investigation into how to get drivers safely back into the loop.
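The contrast between the two cue types can be made concrete with a small sketch. The following Python fragment is a hypothetical illustration, not the study's actual stimulus software: it builds activation schedules for a 6×4 motor matrix, where a static pattern keeps one half of the matrix vibrating for the full cue duration, and a dynamic pattern sweeps activation column by column towards the cued side. The 600 ms duration, the column-wise sweep, and the function names are assumptions for illustration only.

```python
# Hypothetical sketch of static vs. dynamic cue schedules for a 6x4
# vibromotor matrix. Each schedule entry is (start_ms, end_ms, motors),
# where motors is a list of (row, column) indices to activate.
ROWS, COLS = 6, 4
DURATION_MS = 600            # assumed total cue duration
STEP_MS = DURATION_MS // COLS

def static_pattern(direction):
    """All motors on the cued half of the matrix, on for the full duration."""
    half = range(COLS // 2) if direction == "left" else range(COLS // 2, COLS)
    active = [(r, c) for r in range(ROWS) for c in half]
    return [(0, DURATION_MS, active)]

def dynamic_pattern(direction):
    """Columns activate in sequence, sweeping towards the cued side."""
    order = range(COLS - 1, -1, -1) if direction == "left" else range(COLS)
    return [(i * STEP_MS, (i + 1) * STEP_MS,
             [(r, c) for r in range(ROWS)])
            for i, c in enumerate(order)]
```

Under this sketch, a static "left" cue is a single 600 ms burst on the left half of the matrix, whereas a dynamic "left" cue is four 150 ms bursts marching from the rightmost column to the leftmost, which is one plausible way to convey apparent motion towards the cued side.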