Data-Driven Crowd Motion Control With Multi-Touch Gestures

Controlling a crowd using multi-touch devices appeals to the computer games and animation industries, as such devices provide a high-dimensional control signal that can effectively define crowd formation and movement. However, existing systems rely on pre-defined control schemes that users must learn, and these schemes may not be intuitive. We propose a data-driven, gesture-based crowd control system in which the control scheme is learned from example gestures provided by different users. In particular, we build a database of paired samples of gestures and crowd motions. To generalize across the gesture styles of different users, such as the use of different numbers of fingers, we propose a set of gesture features for representing a set of hand gesture trajectories. Similarly, to represent the motion trajectories of varying numbers of crowd characters over time, we propose a set of crowd motion features extracted from a Gaussian mixture model. Given a run-time gesture, our system retrieves the K nearest gestures from the database and interpolates the corresponding crowd motions to generate the run-time control. Our system is accurate and efficient, making it suitable for real-time applications such as real-time strategy games and interactive animation controls.
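The retrieval-and-interpolation step described above can be sketched as a K-nearest-neighbor lookup over gesture feature vectors, blending the paired crowd motion features with inverse-distance weights. This is a minimal illustration only: the function and variable names are hypothetical, and the paper's actual gesture and crowd features (and its interpolation scheme) are more elaborate than plain Euclidean distance over flat vectors.

```python
import math


def knn_blend(query_feat, database, k=3, eps=1e-6):
    """Blend crowd motions for a run-time gesture via K-nearest-neighbor retrieval.

    `query_feat`  -- feature vector of the run-time gesture.
    `database`    -- list of (gesture_feature, crowd_motion_feature) pairs,
                     i.e. the paired gesture/crowd-motion samples.
    Returns an inverse-distance-weighted average of the crowd motion
    features of the K nearest gestures. All names here are illustrative.
    """
    # Rank database gestures by Euclidean distance to the query gesture.
    nearest = sorted(
        database,
        key=lambda pair: math.dist(query_feat, pair[0]),
    )[:k]

    # Inverse-distance weights; eps guards against division by zero
    # when the query exactly matches a stored gesture.
    weights = [1.0 / (math.dist(query_feat, g) + eps) for g, _ in nearest]
    total = sum(weights)

    # Weighted average of the corresponding crowd motion feature vectors.
    dim = len(nearest[0][1])
    return [
        sum(w * m[i] for w, (_, m) in zip(weights, nearest)) / total
        for i in range(dim)
    ]
```

A query close to one stored gesture yields a blend dominated by that gesture's paired crowd motion, which is the behavior the abstract relies on for generating run-time control.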
