May the Force Be with You: Ultrasound Haptic Feedback for Mid-Air Gesture Interaction in Cars

The use of ultrasound haptic feedback for mid-air gestures in cars has been proposed to give users a sense of control over their intended actions and to add touch to an otherwise touchless interaction. However, the impact of delivering ultrasound feedback to the gesturing hand on lane deviation, eyes-off-the-road time (EORT) and perceived mental demand has not yet been measured. This paper investigates the impact of uni- and multimodal presentation of ultrasound feedback on the primary driving task and the secondary gesturing task in a simulated driving environment. The multimodal conditions paired ultrasound with visual, auditory, and peripheral-light feedback. We found that ultrasound feedback presented unimodally and bimodally resulted in significantly less EORT than visual feedback. Our results suggest that multimodal ultrasound feedback for mid-air interaction decreases EORT without compromising driving performance or increasing mental demand, and can therefore increase safety while driving.
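For concreteness, the minimal Python sketch below (not part of the paper) illustrates how the two driving-performance measures mentioned above could be derived from raw study data: EORT from a timestamped gaze log and lane deviation as the standard deviation of lateral lane position (SDLP). The function names `compute_eort` and `lane_deviation`, the zero-order-hold treatment of gaze samples, and the sample data are all assumptions for illustration, not the authors' analysis pipeline.

```python
from typing import List, Tuple
import math

# Hypothetical gaze log: (timestamp in seconds, True if gaze is on the road) pairs
# as an eye tracker might report them.
GazeSample = Tuple[float, bool]


def compute_eort(samples: List[GazeSample]) -> float:
    """Sum the durations of intervals whose starting sample is off-road.

    Assumes samples are ordered by time and that each sample's state holds
    until the next sample arrives (zero-order hold).
    """
    eort = 0.0
    for (t0, on_road), (t1, _) in zip(samples, samples[1:]):
        if not on_road:
            eort += t1 - t0
    return eort


def lane_deviation(lateral_positions: List[float]) -> float:
    """Standard deviation of lateral lane position (SDLP), a common lane-keeping metric."""
    mean = sum(lateral_positions) / len(lateral_positions)
    var = sum((x - mean) ** 2 for x in lateral_positions) / len(lateral_positions)
    return math.sqrt(var)


if __name__ == "__main__":
    gaze = [(0.0, True), (1.2, False), (1.9, True), (4.0, False), (4.5, True), (6.0, True)]
    print(f"EORT: {compute_eort(gaze):.2f} s")   # 0.7 s + 0.5 s = 1.20 s off-road
    print(f"SDLP: {lane_deviation([0.1, -0.2, 0.05, 0.3, -0.1]):.3f} m")
```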
