JESTKOD database: Dyadic interaction analysis

In human-to-human communication, gesture and speech coexist in time with tight synchrony; we tend to use gestures to complement and emphasize speech. In this study we present the JESTKOD database, a valuable resource for examining gesture and speech toward the design of more natural human-computer interaction systems. The JESTKOD database consists of speech and motion capture recordings of dyadic interactions under friendly and unfriendly interaction scenarios. In this paper we describe our multimodal data collection process as well as early experimental studies on friendly/unfriendly classification of dyadic interactions using body gesture and speech data.