T9+HUD: Physical Keypad and HUD Can Improve Driving Performance while Typing and Driving

We introduce T9+HUD, a text entry method designed to reduce visual distraction while typing and driving. T9+HUD combines a physical 3x4 keypad mounted on the steering wheel with a head-up display (HUD) that projects the output onto the windshield. Previous work suggests that this combination may be visually less demanding than the prevalent touchscreen approach, which requires shifting visual attention away from the road. We present a prototype design and report first results from a controlled evaluation in a driving simulator. While driving, the T9+HUD text entry rate matched that of a dashboard-mounted touchscreen device, but lane deviations were reduced by 70%. Furthermore, lane-keeping performance with T9+HUD did not differ significantly from baseline driving. T9+HUD also decreased glance time off the road by 64% compared to the touchscreen QWERTY. We conclude that the data are favorable and warrant further research on attention-reducing text input methods for driving.
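
The method relies on T9-style predictive text: each of the eight letter keys on a 3x4 keypad maps to several letters, and the typed key sequence is disambiguated against a dictionary. The Python sketch below illustrates that disambiguation step under standard assumptions; the key layout is the conventional ITU E.161 assignment, while the dictionary and decoding logic are illustrative assumptions rather than the prototype's actual implementation.

```python
# Minimal sketch of T9-style disambiguation on a 3x4 keypad (illustrative only;
# the prototype's decoder and dictionary are not described in the abstract).
T9_KEYS = {
    '2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
    '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz',
}

# Inverse map from letter to key, built from the layout above.
LETTER_TO_KEY = {ch: key for key, letters in T9_KEYS.items() for ch in letters}

def word_to_keys(word: str) -> str:
    """Map a word to its digit-key sequence (e.g. 'road' -> '7623')."""
    return ''.join(LETTER_TO_KEY[ch] for ch in word.lower())

def disambiguate(key_sequence: str, dictionary: list[str]) -> list[str]:
    """Return all dictionary words whose key sequence matches the input."""
    return [w for w in dictionary if word_to_keys(w) == key_sequence]

# Usage: the sequence 5-2-6-3 is ambiguous between 'lane' and 'land',
# so the system must offer candidates for the driver to confirm.
vocab = ['lane', 'land', 'road', 'wheel']
print(disambiguate('5263', vocab))   # ['lane', 'land']
print(word_to_keys('road'))          # '7623'
```

In a real system the candidate list would be ranked by a language model or word frequencies; the point here is only that one key press per letter suffices, which keeps the driver's hands on the wheel while the HUD shows the candidates.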
