Detecting deviations in visual path following for indoor environments

Blind and elderly persons often require support from a caregiver for day-to-day mobility. Assistive devices such as walkers allow such persons to move around independently. In indoor environments, a camera-based system mounted on an assistive walker can enable automatic following of pre-defined paths. Such indoor paths can be learned visually during training and then followed by the walker. If the walker strays from the pre-defined path, corrective navigation cues can be provided to the user as voice output. Detecting deviations from the path currently being followed is a pre-condition for providing such corrections. This paper addresses path following in static indoor environments using a single uncalibrated camera mounted on the walker. We propose a method for visually detecting deviations from a pre-defined path without using wheel odometry or any other sensor input. During training, the variations in the appearance of the pre-defined path are captured by traversing the path. During path following (or testing), a similarity measure between the current test frame and the corresponding training frame is computed and used to detect deviation. Because the camera may not move along exactly the same trajectory during training and testing, optical flow warping is proposed to spatially align each pair of training and test frames. Further, since the speed of the moving camera varies, temporal synchronization between the training and test frame sequences is established using dynamic time warping. The proposed method for deviation detection in visual path following is evaluated on a set of indoor path videos and the results are discussed.
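The pipeline described in the abstract (spatial alignment by optical flow warping, a frame-similarity measure, and temporal synchronization by dynamic time warping) can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the dense flow field is assumed to be estimated separately (e.g. by a Horn-Schunck style method), frames are grayscale NumPy arrays, and mean absolute pixel difference stands in for the paper's similarity measure.

```python
import numpy as np

def warp_with_flow(frame, flow):
    """Warp a test frame toward the training frame's viewpoint using a
    dense flow field. flow[y, x] = (dx, dy) is assumed to be computed
    separately; this uses a simple nearest-neighbour remap."""
    h, w = frame.shape
    ys, xs = np.mgrid[0:h, 0:w]
    xs2 = np.clip(np.round(xs + flow[..., 0]).astype(int), 0, w - 1)
    ys2 = np.clip(np.round(ys + flow[..., 1]).astype(int), 0, h - 1)
    return frame[ys2, xs2]

def frame_distance(a, b):
    """Dissimilarity between two frames: mean absolute pixel difference
    (a stand-in for the paper's similarity measure)."""
    return float(np.mean(np.abs(a.astype(float) - b.astype(float))))

def dtw_align(train_frames, test_frames):
    """Dynamic time warping between a training and a test frame sequence.
    Returns the warping path as (train_idx, test_idx) pairs and the
    accumulated alignment cost, which absorbs speed differences between
    the two traversals of the path."""
    n, m = len(train_frames), len(test_frames)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = frame_distance(train_frames[i - 1], test_frames[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # skip a train frame
                                 cost[i, j - 1],      # skip a test frame
                                 cost[i - 1, j - 1])  # match the pair
    # Backtrack from the end to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    path.reverse()
    return path, float(cost[n, m])
```

In a full system, each DTW-matched pair would first be spatially aligned with `warp_with_flow`, and a large post-alignment `frame_distance` (relative to some threshold) would signal a deviation from the learned path.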
