Motion Adaptive Orientation Adjustment of a Virtual Teacher to Support Physical Task Learning

Watching a real teacher in a real environment from an appropriate distance and with a clear viewing angle has a significant effect on learning physical tasks, and the same holds for physical-task learning in a mixed-reality environment. Observing and imitating body motion is important for learning many physical tasks, including spatial collaborative work. When people learn a task that involves physical objects, they want to try and practice the task with the actual objects, and they also want to keep the reference behavior model close to them at all times. Displaying a virtual teacher with mixed-reality technology can create such an environment, and this is what the present study investigates. It is known that the virtual teacher's position and orientation influence (a) the number of errors and (b) the completion time in physical-task learning in mixed-reality environments. This paper proposes a method that automatically adjusts the virtual teacher's horizontal rotation angle so that the learner can easily observe the important body motions. The method divides the whole task motion into fixed-duration segments, identifies the most important moving body part in each segment, and rotates the virtual teacher so that this part is presented to the learner. To evaluate the method, a generic physical-task learning experiment was conducted. The method proved effective for motions in which the most important moving part shifts gradually, as in some manufacturing and cooking tasks. This study is therefore expected to enhance the transfer of physical-task skills.
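The abstract describes the adjustment algorithm only at a high level. The following Python sketch illustrates one possible reading of it: the motion is split into fixed-duration segments, the joint with the largest displacement in each segment is taken as the "most important moving part" (an assumption; the paper does not say how importance is measured), and a horizontal rotation is computed so that this joint faces the learner. The segment length, the y-up coordinate convention, and the use of joint 0 as the body root are likewise assumptions for illustration.

```python
import numpy as np

def segment_frames(frames, frame_rate, segment_seconds=2.0):
    """Split a (T, J, 3) array of joint positions into fixed-duration segments."""
    step = max(1, int(round(frame_rate * segment_seconds)))
    return [frames[i:i + step] for i in range(0, len(frames), step)]

def most_active_joint(segment):
    """Index of the joint with the largest total displacement in the segment."""
    displacement = np.linalg.norm(np.diff(segment, axis=0), axis=2).sum(axis=0)
    return int(np.argmax(displacement))

def teacher_yaw_for_segment(segment, joint, learner_dir=np.array([0.0, 0.0, 1.0])):
    """Horizontal rotation (radians) that turns the active joint toward the learner.

    The joint's mean offset from the body root (joint 0, an assumption) is
    projected onto the horizontal plane and aligned with the learner's
    viewing direction.
    """
    offset = (segment[:, joint] - segment[:, 0]).mean(axis=0)
    offset[1] = 0.0  # keep the rotation purely horizontal (y-up assumed)
    if np.linalg.norm(offset) < 1e-6:
        return 0.0
    angle_joint = np.arctan2(offset[0], offset[2])
    angle_learner = np.arctan2(learner_dir[0], learner_dir[2])
    return angle_learner - angle_joint

def adjust_orientation(frames, frame_rate):
    """Yield (segment_index, joint_index, yaw) for each fixed-duration segment."""
    for i, seg in enumerate(segment_frames(frames, frame_rate)):
        if len(seg) < 2:
            continue
        j = most_active_joint(seg)
        yield i, j, teacher_yaw_for_segment(seg, j)
```

As a usage example, calling adjust_orientation(frames, frame_rate=30) on a (T, J, 3) array of captured joint positions yields one horizontal rotation per segment; in practice these angles would presumably be interpolated so the virtual teacher does not turn abruptly.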
