Autonomous mapping between motions and labels

A labeled motion library, in which robot motions are associated with semantic meanings (e.g., words), is useful for human-robot interaction: a robot can use it to autonomously select motions that support its non-verbal communication. Manually assigning labels to each new motion added to a motion library is time-consuming. However, a new motion may be similar to motions already in the labeled library, and can therefore be mapped to existing labels. We formally define motions, labels, and mappings between motions and labels. We use a NAO humanoid robot as a motivating example, though our approach generalizes to any humanoid robot with rotational joints. We explain how we generate motions and labels, define eight distance metrics to measure the similarity between motions, and use the nearest neighbor algorithm to determine the labels of a new motion. The distance metrics vary along three axes: Euclidean versus Hausdorff, joint angles versus points of interest (postures), and mirrored versus non-mirrored. We evaluate the efficacy of these eight distance metrics in terms of precision, recall, and computational complexity.
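To make the label-mapping idea concrete, the following is a minimal sketch of nearest-neighbor label assignment using one of the simpler metric choices (Euclidean distance over joint-angle trajectories). It assumes motions are stored as fixed-length (n_frames, n_joints) NumPy arrays of joint angles; the function names, data layout, and resampling assumption are illustrative, not the paper's actual interface.

```python
import numpy as np

def euclidean_joint_distance(motion_a, motion_b):
    """Euclidean distance between two joint-angle trajectories.

    Each motion is an (n_frames, n_joints) array of joint angles
    (radians). Here both motions are assumed to have been resampled
    to the same number of frames so the arrays align element-wise.
    """
    return float(np.linalg.norm(motion_a - motion_b))

def nearest_neighbor_labels(new_motion, library,
                            distance=euclidean_joint_distance):
    """Assign labels to a new motion from its nearest labeled neighbor.

    `library` is a list of (motion, labels) pairs; the labels of the
    motion with the smallest distance to `new_motion` are returned.
    """
    _, best_labels = min(library,
                         key=lambda item: distance(new_motion, item[0]))
    return best_labels
```

The other metric variants described above (Hausdorff instead of Euclidean, points of interest instead of joint angles, mirrored instead of non-mirrored) would slot in by swapping the `distance` argument while leaving the nearest-neighbor step unchanged.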