Cooperative Navigation for Mixed Human–Robot Teams Using Haptic Feedback

In this paper, we present a novel cooperative navigation control strategy for human–robot teams. We consider a scenario in which a human must reach a target location in a large environment with the help of a mobile robot, which steers the human from the initial position to the target. The challenges posed by cooperative human–robot navigation are typically addressed by means of haptic feedback delivered through physical interaction. In contrast, in this paper we describe an approach in which the human–robot interaction is achieved via wearable vibrotactile armbands. In the proposed approach, the subject is free to decide her/his own pace, and a vibrational warning signal is generated by the haptic armbands only when the robot detects a large deviation from the desired pose. The proposed method has been evaluated in a large indoor environment, where 15 blindfolded human subjects were asked to follow the haptic cues provided by the robot. The participants had to reach a target area while avoiding static and dynamic obstacles. Experimental results show that the blindfolded subjects were able to avoid the obstacles and safely reach the target in all of the performed trials. Finally, we compare the results obtained with blindfolded users to those obtained in experiments with sighted participants.
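To illustrate the feedback rule described above (vibrate only when the deviation from the desired pose exceeds a tolerance, leaving the subject otherwise free to keep her/his own pace), the following minimal Python sketch shows one possible threshold-based cueing policy. It is not the authors' controller: the `Pose2D` type, the single heading-error deviation measure, the 0.35 rad threshold, and the left/right armband mapping are all illustrative assumptions.

```python
import math
from dataclasses import dataclass


@dataclass
class Pose2D:
    x: float      # metres
    y: float      # metres
    theta: float  # heading, radians


def heading_error(desired: Pose2D, actual: Pose2D) -> float:
    """Signed heading deviation in radians, wrapped to (-pi, pi]."""
    err = desired.theta - actual.theta
    return math.atan2(math.sin(err), math.cos(err))


def armband_command(desired: Pose2D, actual: Pose2D, threshold: float = 0.35):
    """Return which armband (if any) should vibrate.

    None means the deviation is within tolerance, so no cue is given and the
    subject keeps walking at her/his own pace. 'left'/'right' asks the subject
    to correct toward that side (an assumed sign convention, not the paper's).
    """
    err = heading_error(desired, actual)
    if abs(err) < threshold:
        return None
    return "left" if err > 0 else "right"


if __name__ == "__main__":
    desired = Pose2D(1.0, 2.0, 0.0)
    actual = Pose2D(1.1, 2.1, -0.6)   # subject drifted clockwise of the desired heading
    print(armband_command(desired, actual))  # -> 'left' (turn counter-clockwise to correct)
```

In practice, the robot would evaluate such a rule at each control step, using its estimate of the subject's pose relative to the planned path; the sketch only captures the thresholded, warning-only nature of the vibrotactile cueing described in the abstract.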
