Real Time Multimodal Interaction with Animated Virtual Human

This paper describes the design and implementation of a real-time animation framework in which an animated virtual human performs multimodal interactions with a human user. The animation system consists of several functional components: perception, behaviour generation, and motion generation. The virtual human agent has a complex underlying geometric structure with multiple degrees of freedom (DOFs). It relies on a virtual perception system to capture information from its environment, and it responds to the human user's commands with a combination of non-verbal behaviours, including co-verbal gestures, posture, body motions, and simple utterances. A language processing module is incorporated to interpret the user's commands. In particular, an efficient motion generation method has been developed that combines motion-captured data with parameterized actions generated in real time, producing variations in the agent's behaviour that depend on its momentary emotional state.
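As a rough illustration of the kind of per-DOF blending the abstract describes, the sketch below interpolates, joint by joint, between a motion-captured pose and a procedurally parameterized action, with the blend weight and action amplitude driven by a scalar emotional-state value. This is a minimal sketch, not the authors' implementation: the `arousal` parameter, the `parameterized_wave` action, and the specific weighting functions are all hypothetical stand-ins for whatever emotion model and action parameterization the paper actually uses.

```python
import math
from dataclasses import dataclass


@dataclass
class Pose:
    """Joint angles (radians) indexed by DOF name."""
    angles: dict[str, float]


def lerp(a: float, b: float, t: float) -> float:
    return a + (b - a) * t


def blend_poses(captured: Pose, parameterized: Pose, weight: float) -> Pose:
    """Per-DOF linear blend; weight=0 keeps the mocap pose unchanged."""
    return Pose({
        dof: lerp(captured.angles[dof],
                  parameterized.angles.get(dof, captured.angles[dof]),
                  weight)
        for dof in captured.angles
    })


def parameterized_wave(t: float, amplitude: float) -> Pose:
    """Hypothetical procedural waving action; amplitude scales the motion."""
    return Pose({
        "r_shoulder": 1.2,
        "r_elbow": 0.6 + amplitude * math.sin(6.0 * t),
    })


def animate_frame(t: float, mocap_pose: Pose, arousal: float) -> Pose:
    # Assumed mapping: higher arousal -> larger, more procedural motion.
    action = parameterized_wave(t, amplitude=0.3 + 0.4 * arousal)
    return blend_poses(mocap_pose, action, weight=0.5 * arousal)


if __name__ == "__main__":
    base = Pose({"r_shoulder": 1.0, "r_elbow": 0.5})
    for frame in range(3):
        print(animate_frame(frame / 30.0, base, arousal=0.8).angles)
```

Under this scheme a calm agent replays the captured clip nearly verbatim, while an aroused agent drifts toward the larger parameterized action, which is one plausible way to realize the behavioural variation the abstract attributes to momentary emotional state.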
