Tactile gestures for human/robot interaction

Gesture-based programming is a new paradigm intended to ease the burden of programming robots. By tapping into the user's wealth of experience with contact transitions, compliance, uncertainty, and operations sequencing, we hope to provide a more intuitive programming environment for complex, real-world tasks based on the expressiveness of nonverbal communication. Accomplishing this requires the ability to interpret gestures to infer the intentions behind them. As a first step toward this goal, this paper presents an application of distributed perception for inferring a user's intentions by observing tactile gestures. These gestures consist of sparse, inexact, physical "nudges" applied to the robot's end effector for the purpose of modifying its trajectory in free space. A set of independent agents, each with its own local, fuzzified, heuristic model of a particular trajectory parameter, observes data from a wrist force/torque sensor to evaluate the gestures. The agents then independently determine the confidence of their respective findings, and distributed arbitration resolves the interpretation through voting.
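The distributed-perception scheme described above can be sketched in a few lines: each agent scores an observed force/torque "nudge" against its own fuzzified heuristic model of one trajectory parameter, and arbitration picks the interpretation with the strongest vote. This is only an illustrative sketch, not the authors' implementation; the agent names, the triangular membership functions, and the simple two-axis wrench are all assumptions made for the example.

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

class Agent:
    """Holds a local fuzzy model of one trajectory parameter (hypothetical)."""
    def __init__(self, name, axis, a, b, c):
        self.name = name      # interpretation this agent advocates
        self.axis = axis      # which wrench component it watches
        self.a, self.b, self.c = a, b, c

    def evaluate(self, wrench):
        # Confidence that the observed nudge targets this agent's parameter.
        return tri(wrench[self.axis], self.a, self.b, self.c)

def arbitrate(agents, wrench):
    """Each agent votes with its confidence; the highest vote wins."""
    votes = {ag.name: ag.evaluate(wrench) for ag in agents}
    winner = max(votes, key=votes.get)
    return winner, votes

# Illustrative agents over a 2-axis wrench (Fx, Fy), forces in newtons.
agents = [
    Agent("speed up",  0,   2.0,  6.0, 10.0),  # tangential push
    Agent("slow down", 0, -10.0, -6.0, -2.0),  # tangential pull
    Agent("veer left", 1,   2.0,  6.0, 10.0),  # lateral nudge
]

wrench = (1.0, 5.5)  # a sparse, inexact lateral nudge at the wrist
intent, votes = arbitrate(agents, wrench)
```

Here the lateral agent reports high confidence while the tangential agents report none, so voting resolves the nudge as "veer left"; in the paper's setting each agent's model is a heuristic over richer six-axis force/torque data, but the arbitration structure is the same.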
