A navigation assistant based on a tactile-acoustical interface and augmented map information is presented, affording blind people real and virtual exploration of the 2007 CSUN Conference environment. By tapping on a touch screen, users can call up the hotel layout and conference-related information.

Introduction

Negotiating new environments can be challenging for all of us, but blind people face far greater navigation and orientation difficulties in such situations. At conferences, for example, the environment must be learned quickly and may even change from day to day. Information about the location of meeting rooms, restrooms, lunch and break areas, booths, company representatives, and products is especially difficult to obtain. This author's experience at the last CSUN Conference stimulated the development of a specialized application of our electronic Tactile-Acoustical Navigation and Information Assistant (TANIA) system for this year's meeting. Unlike commercially available navigation systems, which are usually inoperable indoors without the installation of a time- and/or cost-intensive signal or marker infrastructure, the TANIA system requires no infrastructure. It provides indoor navigation support for blind and visually impaired people based on a step-recognition method and simple building maps. These maps have been augmented with additional information supplied by hotel management, conference organizers, and exhibitors.
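The following minimal sketch illustrates the general idea behind such an infrastructure-free approach, not the actual TANIA implementation (which is not published here): each detected step advances an estimated position along the current heading over a simple 2-D building map, and tapping a map location returns the nearest annotated point of interest as text for speech output. All names and parameters (StepDeadReckoner, MapPOI, the 0.7 m step length) are hypothetical.

```python
import math
from dataclasses import dataclass


@dataclass
class MapPOI:
    """An annotated point on the building map (e.g. a booth or restroom)."""
    x: float          # metres, map coordinates
    y: float
    description: str  # text to be spoken by the acoustical interface


class StepDeadReckoner:
    """Advance a position estimate by one assumed step length per detected step."""

    def __init__(self, x: float, y: float, step_length_m: float = 0.7):
        self.x, self.y = x, y
        self.step_length_m = step_length_m  # assumed average stride

    def on_step(self, heading_deg: float) -> tuple:
        """Update the estimate when a step is detected at the given compass heading."""
        rad = math.radians(heading_deg)
        self.x += self.step_length_m * math.sin(rad)  # east component
        self.y += self.step_length_m * math.cos(rad)  # north component
        return self.x, self.y


def describe_tap(pois: list, tap_x: float, tap_y: float) -> str:
    """Return the description of the POI closest to a tapped map position."""
    nearest = min(pois, key=lambda p: math.hypot(p.x - tap_x, p.y - tap_y))
    return nearest.description


if __name__ == "__main__":
    pois = [
        MapPOI(12.0, 4.5, "Registration desk"),
        MapPOI(30.0, 18.0, "Exhibit hall, booth area"),
    ]
    walker = StepDeadReckoner(x=0.0, y=0.0)
    for _ in range(10):                  # ten detected steps heading roughly east
        walker.on_step(heading_deg=90.0)
    print("Estimated position:", walker.x, walker.y)
    print("Tapped location:", describe_tap(pois, 11.0, 5.0))
```

In a deployed system the step detector and heading would come from body-worn inertial sensors and a compass, and the map annotations would carry the hotel- and exhibitor-supplied information mentioned above; this sketch only shows how a position estimate and a tap query can be combined without any installed signal or marker infrastructure.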