Acquiring communicative motor acts of a social robot using interactive evolutionary computation

Motor control for a social robot poses challenges beyond stability and accuracy. Human observers perceive motor actions as semantically rich, regardless of whether the robot intends the imputed meaning. This intuitive perception of intent and emotional state is constrained by the robot's appearance and movement, and is also shaped by the people with whom it interacts. We attempt to make a robot regulate its interactions so that its perceptual and motor capabilities align with what humans perceive in their interactions with one another. To this end, the robot's behaviors must be designed to be readable to humans and variable within its allowable operations, letting both robot and human participate in natural and intuitive social interactions. We introduce the idea of visual segmentation of a continuous stream of motor actions, and on this basis we design a mobile robot that can generate a variety of semantically rich movements. By receiving feedback on how a human observer feels when watching these movements, the robot evolves its behaviors so that the observer can form consistent emotional ontologies cooperatively with the robot.
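The evolution process described above is a form of interactive evolutionary computation, in which a human observer's subjective rating replaces a programmed fitness function. The following is a minimal sketch of such a loop in Python; the genome encoding, population size, mutation scheme, truncation selection, and 0-10 rating scale are illustrative assumptions, not the actual implementation reported here.

import random

# Hypothetical encoding: each motor act is a fixed-length parameter
# vector (e.g., trajectory knots or wheel-velocity profiles).
GENOME_LEN = 8      # parameters per motor act (assumption)
POP_SIZE = 6        # kept small so a human can rate every candidate
MUTATION_STD = 0.1  # standard deviation of Gaussian mutation
GENERATIONS = 5

def random_genome():
    return [random.uniform(-1.0, 1.0) for _ in range(GENOME_LEN)]

def mutate(genome):
    # Gaussian perturbation, clipped to the allowable parameter range.
    return [max(-1.0, min(1.0, g + random.gauss(0.0, MUTATION_STD)))
            for g in genome]

def human_rating(genome):
    # In the real system the robot would execute the motor act and the
    # observer would score the emotional impression it conveys.
    # Here a 0-10 score is simply read from the console.
    print("Executing motor act:", ["%.2f" % g for g in genome])
    return float(input("Rate this movement (0-10): "))

def evolve():
    population = [random_genome() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # The human observer's ratings serve as the fitness values.
        scored = [(human_rating(g), g) for g in population]
        scored.sort(key=lambda pair: pair[0], reverse=True)
        parents = [g for _, g in scored[:POP_SIZE // 2]]
        # Next generation: keep the best half, fill the rest with mutants.
        population = parents + [mutate(random.choice(parents))
                                for _ in range(POP_SIZE - len(parents))]
    return population[0]

if __name__ == "__main__":
    best = evolve()
    print("Best motor act parameters:", best)

In the setting of this work, "executing" a genome would mean having the mobile robot play back the corresponding segmented movement, so the observer's ratings gradually steer the population toward movements whose perceived emotional meaning is consistent for that observer.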