Interactive control of avatars animated with human motion data

Real-time control of three-dimensional avatars is an important problem in the context of computer games and virtual environments. Avatar animation and control are difficult, however, because a large repertoire of avatar behaviors must be made available, and the user must be able to select from this set of behaviors, possibly with a low-dimensional input device. One appealing approach to obtaining a rich set of avatar behaviors is to collect an extended, unlabeled sequence of motion data appropriate to the application. In this paper, we show that such a motion database can be preprocessed for flexibility in behavior and efficient search, then exploited for real-time avatar control. Flexibility is created by identifying plausible transitions between motion segments, and efficient search through the resulting graph structure is obtained through clustering. Three interface techniques are demonstrated for controlling avatar motion using this data structure: the user selects from a set of available choices, sketches a path through an environment, or acts out a desired motion in front of a video camera. We demonstrate the flexibility of the approach through four different applications and compare the avatar motion to directly recorded human motion.
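The core idea of identifying plausible transitions between motion segments can be illustrated with a minimal sketch. The code below is not the authors' implementation: it assumes each frame is a flat tuple of joint values, uses a plain Euclidean pose distance in place of the paper's actual frame-comparison metric, and uses a hypothetical `threshold` parameter to decide when two frames are similar enough to support a transition.

```python
import math
from collections import defaultdict

def pose_distance(a, b):
    # Euclidean distance between two poses, each a flat tuple of
    # joint values (a stand-in for a proper weighted pose metric).
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def build_motion_graph(frames, threshold):
    """Build a transition graph over motion-capture frames.

    Each frame i keeps its edge to i+1 (the originally recorded
    continuation); extra edges are added between non-adjacent frames
    whose poses are closer than `threshold`, creating plausible
    transition points between motion segments.
    """
    edges = defaultdict(set)
    n = len(frames)
    for i in range(n - 1):
        edges[i].add(i + 1)  # original recorded continuity
    for i in range(n):
        for j in range(n):
            if abs(i - j) > 1 and pose_distance(frames[i], frames[j]) < threshold:
                edges[i].add(j)  # plausible transition between segments
    return edges

# Tiny illustrative motion: frame 3 nearly repeats frame 0,
# so the graph gains a transition back to the start of the clip.
frames = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0), (0.1, 0.0)]
graph = build_motion_graph(frames, threshold=0.5)
```

In this toy example, frames 0 and 3 are within the threshold, so the avatar could loop from the end of the clip back to its beginning; the paper's clustering step would then group similar nodes of this graph to make search efficient at run time.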
