Let's shake hands! On the coordination of gestures of humanoids

Hand gestures are an important means of expressivity for humanoids. In this paper, by humanoid we mean user-controlled or autonomous virtual humans (VHs) or embodied conversational agents (ECAs) [4], as well as human-like robots [18]. We cover both human-humanoid and humanoid-humanoid interaction. The semantics, the morphology, and the variations in the performance of gestures reflecting cultural, affective, and other characteristics of the speaker [8], as well as general laws of gesture movement [6], have been addressed in earlier work. Our focus in this paper is the coordination of hand gestures with external signals.

One type of coordination, the alignment of speech-accompanying gestures to the speech, has been studied extensively, and different design principles have been formulated and implemented for specific applications with virtual humans [11, 13, 25]. In these cases the phonological synchrony rule [15] has been taken as the basis, usually resulting in gestures timed to the speech, even if the speech is generated by TTS. An exception is [24]: in assembly tasks, where a physical manipulation may take a shorter or longer time, the speech is aligned to the manipulative hand gestures. Another domain where two-handed gestures play a role is sign language [9]. Mechanisms for fast planning of deictic gestures have also been proposed [14].

Our ongoing research extends these works in the following aspects:

• We propose a more general coordination scheme, which takes into account external events such as tempo indications or perceived state information about the interlocutor of the ECA.
• We allow coordination requirements to be declared at a fine level of granularity, referring to the individual stages of a gesture. Such a refined approach makes it possible to experiment with, e.g., expressivity and style, and to use timing strategies as a means to fine-tune the gesturing behavior of a humanoid.
• Our main interest is in the reactive scheduling and planning of gestures with reference to an environment that influences their timing.
• We use the (still under development) BML language to formulate scheduling requirements; a sketch of such a specification is given after this list. As BML is meant to become a general-purpose markup language [3], our testing and extension of its constructs contributes to the development of this unifying language.
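To make the phase-wise coordination concrete, a minimal sketch of such a BML-style specification follows. It assumes the general conventions of the BML proposal by Kopp et al.: behaviors with sync points such as start, ready, stroke, relax, and end, and cross-references between them. The concrete element names, the lexeme WAVE, and the sync-point reference are illustrative assumptions, not the authors' actual markup.

    <bml id="greeting">
      <speech id="s1">
        <text>Hello there, <sync id="peak"/> nice to see you!</text>
      </speech>
      <!-- require the stroke phase of the wave gesture to coincide with the
           marked point in the speech; other phase boundaries (start, ready,
           relax, end) could be tied to external events, e.g. a tempo beat or
           the perceived gaze of the interlocutor, in the same way -->
      <gesture id="g1" lexeme="WAVE" stroke="s1:peak"/>
    </bml>

In such a scheme the preparation and retraction phases remain free for the scheduler to stretch or compress as long as the declared stroke constraint is met, which is the kind of freedom the timing strategies mentioned above would exploit.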

[1] Matt Huenerfauth, Representing coordination and non-coordination in an American Sign Language animation, 2005, Assets '05.

[2] Alessandro Duranti et al., Universal and Culture-Specific Properties of Greetings, 1997.

[3] Rachid Alami et al., A methodological approach relating the classification of gesture to identification of human intent in the context of human-robot interaction, 2005, ROMAN 2005, IEEE International Workshop on Robot and Human Interactive Communication.

[4] Ipke Wachsmuth et al., Anticipation in a VR-based anthropomorphic construction assistant, 2003.

[5] M. Argyle et al., Gaze and Mutual Gaze, 1994, British Journal of Psychiatry.

[6] C. Creider, Hand and Mind: What Gestures Reveal about Thought, 1994.

[7] Cynthia Breazeal et al., Social interactions in HRI: the robot view, 2004, IEEE Transactions on Systems, Man, and Cybernetics, Part C (Applications and Reviews).

[8] Franck Poirier et al., Gesture Analysis: Invariant Laws in Movement, 2003, Gesture Workshop.

[9] Matt Huenerfauth, Representing coordination and non-coordination in American Sign Language animations, 2006, Behav. Inf. Technol.

[10] J. Cassell et al., Embodied conversational agents, 2000.

[11] Brigitte Krenn et al., Defining the Gesticon: Language and Gesture Coordination for Interacting Embodied Agents, 2004.

[12] Anton Nijholt et al., Interacting with a Virtual Conductor, 2006, ICEC.

[13] Christopher E. Peters, Evaluating Perception of Interaction Initiation in Virtual Environments Using Humanoid Agents, 2006, ECAI.

[14] Dennis Reidsma et al., Towards a Reactive Virtual Trainer, 2006, IVA.

[15] Rudolf Arnheim et al., Hand and Mind: What Gestures Reveal About Thought by David McNeill (review), 2017.

[16] Stefan Kopp et al., Model-based animation of co-verbal gesture, 2002, Proceedings of Computer Animation 2002 (CA 2002).

[17] Daniel Thalmann et al., Virtual and Real Humans Interacting in the Virtual World, 1995.

[18] P. Greenbaum et al., Varieties of touching in greetings: Sequential structure and sex-related differences, 1980.

[19] Stefan Kopp et al., Lifelike Gesture Synthesis and Timing for Conversational Agents, 2001, Gesture Workshop.

[20] Maurizio Mancini et al., Implementing Expressive Gesture Synthesis for Embodied Conversational Agents, 2005, Gesture Workshop.

[21] Candace L. Sidner et al., Explorations in engagement for humans and robots, 2005, Artif. Intell.

[22] A. Kendon, Movement coordination in social interaction: some examples described, 1970, Acta Psychologica.

[23] Stefan Kopp et al., Towards a Common Framework for Multimodal Generation: The Behavior Markup Language, 2006, IVA.

[24] Atsushi Konno et al., U-Tsu-Shi-O-Mi: the virtual humanoid you can reach, 2006, SIGGRAPH '06.

[25] James C. Lester et al., Deictic Believability: Coordinated Gesture, Locomotion, and Speech in Lifelike Pedagogical Agents, 1999, Appl. Artif. Intell.