A framework for dynamic man-machine interaction implemented on an autonomous mobile robot

In the field of advanced service robotics it is of major importance to design man-machine interfaces that are fast and robust enough to cope with fuzzy and rapid human gestures. This paper describes our implementation of a complex interactive behavior on our multi-degree-of-freedom robot Arnold. The desired behavior is to position Arnold in front of a person and reach for the human hand, making use of all degrees of freedom the robot possesses. This is clearly a basic component of tasks that require man-machine interaction, e.g. passing objects or learning by showing. The complex overall behavior can be decomposed into simple behaviors related to the basic robot devices, i.e. platform, arm and head. These processes act in combination without any explicit exchange of information, coupled only through their effect on the sensor input, following the dynamic approach described by G. Schöner et al. (1996). Fast perception of behaviorally relevant information is achieved by combining color- and stereo-vision algorithms in an active vision process. Closing the feedback loop by observing the environment yields good coordination of active vision and the positioning of hand and platform in real time. We demonstrate that rapid processing of the visual input of an active camera system, together with closed-loop control of a humanoid robot by distributed dynamical systems, is a promising way to achieve robust and fast man-machine interaction.
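The combination of simple behaviors without explicit information exchange can be illustrated with a minimal sketch of attractor dynamics in the style of Schöner et al.: each behavior contributes an additive term to the rate of change of a shared behavioral variable (here, a heading direction phi), and the behaviors are coupled only through that common variable. All function names, parameters and numeric values below are illustrative assumptions, not the paper's actual implementation.

```python
import math

def target_attractor(phi, psi_tar, lam=1.0):
    # Attractive contribution: pulls heading phi toward the target
    # direction psi_tar with relaxation rate lam (hypothetical value).
    return -lam * math.sin(phi - psi_tar)

def obstacle_repellor(phi, psi_obs, beta=2.0, sigma=0.4):
    # Repulsive contribution: pushes phi away from the obstacle
    # direction psi_obs; the effect decays with angular distance.
    d = phi - psi_obs
    return beta * d * math.exp(-d**2 / (2.0 * sigma**2))

def step(phi, psi_tar, psi_obs, dt=0.01):
    # One Euler integration step. The two behaviors are simply summed:
    # they never exchange information directly, they only co-determine
    # the evolution of the shared state variable phi.
    dphi = target_attractor(phi, psi_tar) + obstacle_repellor(phi, psi_obs)
    return phi + dt * dphi

# Relax toward a target direction of 0.8 rad while an obstacle
# lies at 0.3 rad; phi settles at a compromise fixed point.
phi = 0.0
for _ in range(2000):
    phi = step(phi, psi_tar=0.8, psi_obs=0.3)
```

In the full system, one such dynamical system per device (platform, arm, head) would run concurrently, each driven by the visually perceived target and obstacle directions, so that coordination emerges through the environment rather than through message passing.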