Motion Synthesis for Sports Using Unobtrusive Lightweight Body‐Worn and Environment Sensing

The ability to accurately capture an athlete's motion in near real time during competitive play promises to revolutionize not only broadcast sports graphics visualization and commentary, but potentially also performance analysis, sports medicine, fantasy sports and wagering. In this paper, we present a highly portable, non-intrusive approach for synthesizing human athlete motion during competitive game-play with lightweight instrumentation of both the athlete and the field of play. Our data-driven puppetry technique relies on a pre-captured database of short motion-capture segments, from which we construct a motion graph augmented with interpolated motions and speed variations. An athlete's performed motion is synthesized by finding a related action sequence through the motion graph using a sparse set of measurements acquired during the performance from both body-worn inertial sensors and global location sensors. We demonstrate the efficacy of our approach in a challenging application scenario: a high-performance tennis athlete wearing one or more lightweight body-worn accelerometers, with a single overhead camera providing the athlete's global position and orientation. The approach is, however, flexible in both the number and variety of input sensors used. The technique can also be adapted to search a motion graph efficiently in linear time in alternative applications.
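The search step described above can be framed as a Viterbi-style dynamic program over the motion graph, where each sparse measurement is explained by one graph transition. The sketch below is a minimal illustration under assumed representations, not the paper's implementation: the graph is a node-to-edge-list dictionary, and a user-supplied emission cost compares each clip's predicted sensor readings with the measured ones. All names (`search_motion_graph`, `emission_cost`, the toy clip labels) are hypothetical.

```python
import math

def search_motion_graph(graph, emission_cost, observations):
    """Viterbi-style dynamic program: find the minimum-cost node sequence
    through `graph` that explains `observations`.

    graph         : dict, node -> list of (next_node, transition_cost)
    emission_cost : callable (node, observation) -> float, how poorly the
                    clip at `node` matches one sensor measurement
    observations  : list of per-frame measurements (e.g. accelerometer and
                    global position readings)
    """
    if not observations:
        return [], 0.0

    # Best-known cost of a path ending at each node after the first frame.
    best = {node: emission_cost(node, observations[0]) for node in graph}
    back = []  # one dict of back-pointers per subsequent observation

    for obs in observations[1:]:
        new_best, pointers = {}, {}
        for node, cost in best.items():
            for nxt, trans_cost in graph.get(node, []):
                candidate = cost + trans_cost + emission_cost(nxt, obs)
                if candidate < new_best.get(nxt, math.inf):
                    new_best[nxt] = candidate
                    pointers[nxt] = node
        best = new_best
        back.append(pointers)

    # Walk back-pointers from the cheapest terminal node to recover the path.
    node = min(best, key=best.get)
    total = best[node]
    path = [node]
    for pointers in reversed(back):
        node = pointers[node]
        path.append(node)
    path.reverse()
    return path, total


if __name__ == "__main__":
    # Toy usage with a hypothetical three-clip graph and scalar "acceleration"
    # readings; real emission costs would compare multi-sensor measurements.
    graph = {
        "idle":    [("idle", 0.0), ("swing", 0.5)],
        "swing":   [("recover", 0.0)],
        "recover": [("idle", 0.0)],
    }
    predicted = {"idle": 0.1, "swing": 2.0, "recover": 0.8}
    cost = lambda node, obs: abs(predicted[node] - obs)
    path, total = search_motion_graph(graph, cost, [0.1, 1.9, 0.7, 0.2])
    print(path)   # ['idle', 'swing', 'recover', 'idle']
```

Each observation adds a single relaxation pass over the graph's edges, so the run time grows linearly with the number of measurements, which is consistent with the linear-time search property noted above.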
