First Validation of a Generic Method for Emotional Body Posture Generation for Social Robots

Gestures for social robots are often preprogrammed offline or generated by mapping motion-capture data to the robot. Since these gestures depend on the robot's joint configuration, new joint trajectories to reach the desired postures must be implemented whenever a new robot platform with a different morphology is used. The method proposed here aims to minimize the workload of implementing gestures on a new robot platform and to facilitate the sharing of gestures between different robots. The innovative aspect of this method is that it is constructed independently of any robot configuration, and it can therefore be used to generate gestures for different robot platforms. To calculate a posture for a given configuration, the method maps a set of target gestures, stored in a database, to that specific configuration. The method was validated on a series of configurations, including those of existing robots.

Categories and Subject Descriptors
J.4 [Computer Applications]: Social and behavioral sciences; I.2.9 [Artificial Intelligence]: Robotics
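The core idea of a configuration-independent gesture database can be illustrated with a minimal sketch. The code below is a hypothetical simplification, not the paper's actual method: gestures are stored as per-body-part target orientations, and a robot-specific configuration (the body parts it has, the joints realising each part, and their weights and limits) is used to map a gesture onto that robot's joints. All names (`GESTURE_DB`, `map_gesture`, the joint labels) are illustrative assumptions.

```python
# Hypothetical sketch of a robot-independent gesture database:
# each gesture stores a target orientation (degrees) per abstract body part.
GESTURE_DB = {
    "happy": {"left_arm": 60.0, "right_arm": 60.0, "head": 10.0},
    "sad":   {"left_arm": -20.0, "right_arm": -20.0, "head": -30.0},
}

def map_gesture(gesture, robot_config):
    """Map a robot-independent gesture onto a specific robot.

    robot_config maps each body part the robot possesses to the joints
    realising it, with per-joint (weight, lower_limit, upper_limit)
    entries; the body-part target is distributed over those joints
    according to the weights and clamped to the joint limits.
    """
    targets = GESTURE_DB[gesture]
    posture = {}
    for part, joints in robot_config.items():
        if part not in targets:
            continue  # robot part not constrained by this gesture
        for joint, (weight, lo, hi) in joints.items():
            angle = targets[part] * weight
            posture[joint] = max(lo, min(hi, angle))  # respect joint limits
    return posture

# Example: a robot with a one-joint head and a two-joint left arm only.
robot = {
    "head": {"neck_pitch": (1.0, -40.0, 40.0)},
    "left_arm": {"l_shoulder": (0.7, -90.0, 90.0),
                 "l_elbow": (0.3, 0.0, 120.0)},
}
posture = map_gesture("sad", robot)
print(posture)
```

Because the database only speaks of abstract body parts, the same gesture entry can be reused unchanged on robots with different joint counts; only the per-robot configuration table changes, which is the workload reduction the abstract describes.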