Imitative behavior generation for a vision-based partner robot

This paper proposes a method for generating imitative behaviors in a partner robot interacting with a human. We first discuss the role of imitation and then explain our method for imitative behavior generation based on computational intelligence. The robot searches for a human using a CCD camera, and a human hand motion pattern is extracted from the series of images taken by the camera. Next, the position sequence of the extracted human hand is used as input to a spiking neural network in order to recognize the motion as a gesture. The trajectory for a behavior is then generated and updated by a steady-state genetic algorithm based on the observed human motions, and a self-organizing map is used to cluster human hand motion patterns into gestures. Finally, we present several experimental results of imitative behavior generation through interaction with a human.
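As a rough illustration of the gesture-clustering step, the sketch below trains a self-organizing map on 2-D hand-position samples and assigns each sample to its best-matching unit, which serves as a gesture cluster label. This is a generic SOM sketch, not the paper's implementation; the grid size, learning-rate schedule, neighborhood radius, and the toy two-cluster data are all assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(samples, grid=(4, 4), epochs=50, lr0=0.5, sigma0=2.0):
    """Train a self-organizing map on (N, d) samples.

    Returns the (grid_h * grid_w, d) weight vectors of the map units.
    Hyperparameters are illustrative assumptions, not the paper's values.
    """
    h, w = grid
    weights = rng.random((h * w, samples.shape[1]))
    # Grid coordinates of each unit, for the neighborhood function.
    coords = np.array([(i, j) for i in range(h) for j in range(w)], float)
    n_steps = epochs * len(samples)
    t = 0
    for _ in range(epochs):
        for x in samples:
            # Best-matching unit: unit whose weight vector is nearest to x.
            bmu = int(np.argmin(np.linalg.norm(weights - x, axis=1)))
            # Linearly decay learning rate and neighborhood radius.
            frac = t / n_steps
            lr = lr0 * (1.0 - frac)
            sigma = sigma0 * (1.0 - frac) + 1e-3
            # Gaussian neighborhood around the BMU on the map grid.
            d2 = np.sum((coords - coords[bmu]) ** 2, axis=1)
            theta = np.exp(-d2 / (2.0 * sigma ** 2))
            weights += lr * theta[:, None] * (x - weights)
            t += 1
    return weights

# Toy "hand position" data: two well-separated clusters in the unit square.
data = np.vstack([rng.normal([0.2, 0.2], 0.05, (50, 2)),
                  rng.normal([0.8, 0.8], 0.05, (50, 2))])
som = train_som(data)
# Each sample is labeled by its best-matching unit (its gesture cluster).
labels = np.argmin(np.linalg.norm(data[:, None] - som[None], axis=2), axis=1)
```

After training, samples with similar hand trajectories map to nearby units, so the BMU index acts as a discrete gesture label that downstream components can consume.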