SWAN: System for Wearable Audio Navigation

Wearable computers can support audio-only presentation of information; a visual interface is not necessary for effective user interaction. The System for Wearable Audio Navigation (SWAN) is being developed to serve as a navigation and orientation aid for persons who are temporarily or permanently visually impaired. SWAN is a wearable computer with audio-only output and tactile input via a handheld interface. It aids the user in safe pedestrian navigation and allows the user to author new geographic information system (GIS) data relevant to their wayfinding, obstacle-avoidance, and situational-awareness needs. Emphasis is placed on representing pertinent data with non-speech sounds through a process of sonification. SWAN relies on a GIS infrastructure to support geocoding and spatialization of data, and it employs novel tracking technology to maintain an accurate estimate of the user's position and orientation.
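
As a minimal sketch of how geocoded GIS data might be spatialized into a non-speech navigation beacon, the Python fragment below computes the listener-relative azimuth and distance to a waypoint and maps distance to a beacon pulse rate. The function names (e.g., `beacon_params`) and the distance-to-tempo mapping are illustrative assumptions, not SWAN's actual implementation; a full system would render the resulting beacon with spatial (e.g., HRTF-based) audio.

```python
# Illustrative sketch only: maps a GIS waypoint to listener-relative
# spatial-audio beacon parameters. Assumes GPS coordinates in degrees
# and a hypothetical distance-to-pulse-rate mapping.
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(y, x)) % 360.0

def haversine_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance in meters (haversine formula)."""
    r = 6371000.0  # mean Earth radius
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlon / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def beacon_params(user_lat, user_lon, heading_deg, wp_lat, wp_lon):
    """Return (azimuth, distance, pulse rate) for an auditory beacon.

    Azimuth is relative to the user's heading, in [-180, 180) degrees
    with 0 straight ahead; the pulse rate rises as the waypoint nears
    (a hypothetical mapping, clamped to 0.5-4.0 pulses per second).
    """
    rel = bearing_deg(user_lat, user_lon, wp_lat, wp_lon) - heading_deg
    azimuth = (rel + 180.0) % 360.0 - 180.0
    dist = haversine_m(user_lat, user_lon, wp_lat, wp_lon)
    pulses_per_s = max(0.5, min(4.0, 40.0 / max(dist, 1.0)))
    return azimuth, dist, pulses_per_s

# Example: a waypoint roughly 100 m to the northeast while facing north.
az, d, rate = beacon_params(33.7756, -84.3963, 0.0, 33.7763, -84.3955)
```

In a deployed system, the azimuth would drive the binaural rendering of the beacon sound and the pulse rate would convey proximity, so the user can simply walk toward the perceived sound source.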
