Considerations for the Development of Non-Visual Interfaces for Driving Applications

While haptics, tactile displays, and other non-visual user interface topics have been the subject of a variety of research initiatives, little has been done specifically for blind driving. Many technologies have been developed to assist sighted drivers and improve their safety, but enabling a true driving experience without any sense of sight has remained an essentially overlooked area of study. Since 2005, the Robotics & Mechanisms Laboratory at Virginia Tech has assumed the task of developing non-visual interfaces for driving through the Blind Driver Challenge®, a project funded by the National Federation of the Blind. The objective is not to develop a vehicle that will autonomously mobilize blind people, but to develop a vehicle that a blind person can actively and independently operate based on information communicated by non-visual interfaces. This thesis proposes generalized considerations for the development of non-visual interfaces for driving, using the instructional interfaces developed for the Blind Driver Challenge® as a case study. A model is suggested for the function of blind driving as an open-loop control system, wherein the human is an input/output device. Further, a discussion is presented on the relationship between the bandwidth of information communicated to the driver, the amount of human decision-making involved in blind driving, and the cultivation of driver independence. The considerations proposed here are intended to apply generally to the process of non-visual interface development for driving, enabling efficient concept generation and evaluation.
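The open-loop model described above can be sketched in code. The following is an illustrative sketch only, not the thesis's implementation: it assumes a chain in which perception produces a continuous steering error, a bandwidth-limited non-visual interface quantizes it into a small set of discrete cues, and the human, modeled as an input/output device with no visual feedback, maps each cue to a fixed steering correction. All names (`quantize_cue`, `human_response`, `drive_step`) and parameter values are hypothetical.

```python
def quantize_cue(steering_error_deg, levels=5, max_error_deg=30.0):
    """Map a continuous steering error onto a small set of discrete
    cue levels (e.g. haptic pulses), reflecting the limited bandwidth
    of a non-visual interface. Returns an integer in -2..2 for levels=5."""
    clipped = max(-max_error_deg, min(max_error_deg, steering_error_deg))
    step = (2 * max_error_deg) / (levels - 1)
    return round(clipped / step)

def human_response(cue_level, gain_deg_per_level=7.5):
    """Human modeled as an open-loop I/O device: each cue level is
    translated into a fixed steering correction, with no visual
    feedback to close the loop."""
    return cue_level * gain_deg_per_level

def drive_step(steering_error_deg):
    """One pass through the open-loop chain:
    perception -> interface -> human -> steering output."""
    cue = quantize_cue(steering_error_deg)
    return human_response(cue)
```

Note that because the interface quantizes the error, coarser cue resolution (fewer `levels`) lowers the information bandwidth delivered to the driver but demands more of the driver's own judgment between cues, which is the trade-off the thesis discusses.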
