How to make large touch screens usable while driving

Large touch screens have recently been appearing in the automotive market, yet their usability while driving remains controversial. Flat screens provide no haptic guidance and therefore require visual attention to locate the interactive elements they display. New interaction concepts are thus needed to minimize the visual attention required, keeping the driver's focus on the road and ensuring safety. In this paper, we explore three such approaches. The first makes use of proprioception. The second incorporates physical handles to ease orientation on a large flat surface. The third applies directional touch gestures. We report the results of a comparative study that investigates the required visual attention, task performance, and perceived usability of these approaches against a state-of-the-art multifunctional controller. We found that direct touch buttons yielded the best task completion times, but at a size of about 6 × 8 cm they were not yet large enough for blind interaction. Physical elements in and around the screen space were regarded as useful for easing orientation. With touch gestures, participants were able to reduce visual attention below the level required by the remote controller. Based on our findings, we argue that there are ways to make large screens more appropriate for in-car use and thus to harness the advantages they offer in other respects.
