Measuring intent in human-robot cooperative manipulation

To effectively interact with people in a physically assistive role, robots will need to be able to cooperatively manipulate objects with a human partner. For example, it can be very difficult for an individual to manipulate a long or heavy object alone; an assistant can help share the load and improve the maneuverability of the object. Each partner can communicate objectives (e.g., move around an obstacle or put the object down) via non-verbal cues (e.g., moving the end of the object in a particular direction, changing speed, or tugging). Herein, non-verbal communication in a human-robot coordinated manipulation task is addressed using a small articulated robot arm equipped with a 6-axis wrist-mounted force/torque sensor and joint angle encoders. The robot controller uses a Jacobian-transpose velocity PD control scheme with gravity compensation. To aid collaborative manipulation, we implement a uniform impedance controller at the robot end-effector with an attractive force toward a virtual path, in the style of a cobot. Unlike a cobot, this path is recomputed online as a function of user input. In our present research, we utilize force/torque sensor measurements to identify intentional user communications specifying a change in the task direction. We consider the impact of path recomputation, and of the resulting robot haptic feedback, on the user's physiological response.
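Although the abstract gives no implementation, the control law it describes (a path-attracting end-effector impedance mapped to joint torques through the Jacobian transpose, with gravity compensation, plus a force-threshold test for an intentional tug off the current path) can be sketched as below. This is a minimal illustration only: the gains, function names, and tug threshold are assumptions for the sketch, not values from the paper.

```python
import numpy as np

# Hypothetical sketch of the controller described in the abstract:
# an end-effector impedance with an attractive force toward a virtual
# path, mapped to joint torques via the Jacobian transpose, plus
# gravity compensation. All gains and thresholds are illustrative.

K_PATH = 200.0        # path-attraction stiffness [N/m] (assumed)
D_DAMP = 20.0         # uniform end-effector damping [N*s/m] (assumed)
TUG_THRESHOLD = 8.0   # lateral force taken as an intentional tug [N] (assumed)

def control_torques(x, dx, jacobian, gravity, path_point):
    """Joint torques for path-attracting impedance control.

    x, dx      : end-effector position and velocity (3-vectors)
    jacobian   : 3xN end-effector Jacobian J(q)
    gravity    : N-vector of gravity-compensation torques g(q)
    path_point : closest point on the current virtual path to x
    """
    # Spring toward the virtual path, with uniform damping at the tool.
    f_ee = K_PATH * (path_point - x) - D_DAMP * dx
    # Jacobian-transpose mapping to joint space, plus gravity compensation.
    return jacobian.T @ f_ee + gravity

def detect_tug(f_measured, path_tangent):
    """Flag a candidate intentional direction change: a force component
    perpendicular to the path exceeding a threshold (simple heuristic;
    the paper's actual identification method may differ)."""
    f_lateral = f_measured - (f_measured @ path_tangent) * path_tangent
    return np.linalg.norm(f_lateral) > TUG_THRESHOLD
```

When `detect_tug` fires, the virtual path would be recomputed in the tugged direction and `control_torques` would then attract the end-effector to the new path, which is the online-replanning behavior that distinguishes this scheme from a fixed-path cobot.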
