Cooperative human-robot haptic navigation

This paper proposes a novel use of haptic feedback for human navigation with a mobile robot. Assuming that a path planner has provided the mobile robot with an obstacle-free trajectory, the vehicle must steer the human from an initial position to a desired target position by interacting with him/her solely through a custom-designed vibrotactile bracelet. The subject is free to walk at his/her own pace; a warning vibrational signal is generated by the bracelet only when the vision sensor on board the robot detects a large deviation from the planned trajectory. This leads to a cooperative navigation system that is less intrusive, more flexible, and easier to use than those existing in the literature. The effectiveness of the proposed system is demonstrated through extensive real-world experiments.
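The warning logic described above can be sketched as a simple threshold test: the bracelet stays silent while the human tracks the planned path, and vibrates only once the measured deviation grows too large. The sketch below is a minimal illustration under assumed names and an assumed threshold; it is not the authors' implementation.

```python
import math

# Hypothetical warning threshold (metres); the paper does not specify a value.
DEVIATION_THRESHOLD_M = 0.5


def lateral_deviation(human_xy, path):
    """Distance from the tracked human position to the nearest path waypoint.

    `human_xy` is the (x, y) position estimated by the robot's on-board
    vision sensor; `path` is the planner's obstacle-free trajectory,
    represented here as a list of (x, y) waypoints (an assumption).
    """
    return min(math.dist(human_xy, p) for p in path)


def bracelet_should_warn(human_xy, path):
    """Return True when a warning vibration should be issued."""
    return lateral_deviation(human_xy, path) > DEVIATION_THRESHOLD_M
```

In this formulation the bracelet is event-driven rather than continuously active, which matches the abstract's claim of a less intrusive interface: no signal is sent while the subject walks at his/her own pace within tolerance of the path.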
