Immersive and collaborative Taichi motion learning in various VR environments

Learning "motion" online or from video tutorials is usually inefficient, since motion information is difficult to convey through traditional media on an ordinary PC platform. This paper presents ImmerTai, a system that can efficiently teach motion, in particular Chinese Taichi motion, in various immersive environments. ImmerTai captures a Taichi expert's motion and delivers it to students in multi-modal forms in immersive CAVE, HMD, and ordinary PC environments. The students' motions are also captured, both for quality assessment and to form a virtual collaborative learning atmosphere. We built a Taichi motion dataset of 150 fundamental Taichi motions captured from 30 students, on which we evaluated the learning effectiveness and user experience of ImmerTai. The results show that ImmerTai can improve learning efficiency by up to 17.4% and learning quality by up to 32.3%.
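The abstract does not specify how student motion is scored against the expert's. A common approach for comparing captured pose sequences of differing speeds is dynamic time warping over per-frame joint distances; the sketch below is a minimal illustration of that general idea, not the paper's actual assessment algorithm, and the frame representation (flat lists of joint coordinates) is an assumption.

```python
import math

def frame_distance(a, b):
    # Euclidean distance between two pose frames,
    # each a flat list of joint coordinates (assumed layout).
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def dtw_score(expert, student):
    # Dynamic-time-warping cost between an expert and a student
    # pose sequence; lower cost means closer motion, and the
    # warping tolerates differences in execution speed.
    n, m = len(expert), len(student)
    inf = float("inf")
    cost = [[inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = frame_distance(expert[i - 1], student[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # skip a student frame
                                 cost[i][j - 1],      # skip an expert frame
                                 cost[i - 1][j - 1])  # match frames
    return cost[n][m]

# Hypothetical usage: an identical performance scores 0;
# small deviations accumulate a small cost.
expert = [[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]]
student = [[0.0, 0.0], [1.0, 0.1], [2.0, 0.0]]
perfect = dtw_score(expert, expert)   # 0.0
deviated = dtw_score(expert, student)
```

Any real system would of course operate on full skeleton data from the motion-capture device rather than toy 2-D points.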
