Reducing driver distraction by improving secondary task performance through multimodal touchscreen interaction

Methods of information presentation in the automotive space have evolved continuously in recent years. As technology pushes the boundaries of what is possible, automobile manufacturers are trying to keep up with current trends. Traditionally, the often-long development and quality-control cycles of the automotive sector ensured slow yet steady progress. However, the exponential advancement of mobile and handheld computing over the last ten years has put immense pressure on automobile manufacturers to catch up. For this reason, manufacturers are now exploring techniques for in-vehicle interaction (IVI) that were ignored in the past. Recent attempts, however, have either simply extended the interaction model already used in mobile and handheld computing devices or increased visual-only presentation of information, with limited expansion to other modalities (e.g., audio or haptics). This is also true of system interaction, which generally takes place in complex driving environments, making the driver's primary task (driving) even more challenging. There is therefore an inherent need to design and research IVI systems that natively support a multimodal interaction approach, providing all the necessary information without increasing the driver's cognitive load or, at a bare minimum, their visual load. In this research we focus on a key element of IVI systems, touchscreen interaction, by developing prototype devices that complement the conventional visual and auditory modalities in a simple and natural manner. Instead of adding primitive touch feedback cues that increase redundancy or complexity, we approach the issue by examining the current requirements of interaction and complementing the existing system with natural and intuitive input and output methods that are less affected by environmental noise than traditional multimodal systems.
