Multimodal motion guidance: techniques for adaptive and dynamic feedback

The ability to guide human motion through automatically generated feedback has significant potential for applications such as motor learning, human-computer interaction, telepresence, and augmented reality. This paper approaches the design and development of such systems from a human cognition and perception perspective. We analyze the dimensions of the design space for motion guidance systems, spanned by technologies and human information processing, and identify opportunities for new feedback techniques. Based on these insights, we present a novel motion guidance system that provides feedback on position, direction, and continuous velocity. It uses motion capture to track a user in space and guides them through visual, vibrotactile, and pneumatic actuation. The system also introduces motion retargeting through time warping, motion dynamics, and prediction, allowing greater flexibility and adaptability to user performance.
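The time-warping component of motion retargeting mentioned above can be illustrated with a standard dynamic time warping (DTW) alignment between a reference motion and a user's performed motion. The sketch below is a minimal, illustrative implementation, not the paper's actual algorithm; the function name `dtw_align` and the frame representation (arrays of joint positions) are assumptions made for this example.

```python
import numpy as np

def dtw_align(reference, user):
    """Align a user's motion trajectory to a reference trajectory via
    dynamic time warping. Both inputs are arrays of shape (T, d),
    one pose vector per frame. Returns the warping path (list of
    (reference_frame, user_frame) pairs) and the total alignment cost."""
    n, m = len(reference), len(user)
    # Accumulated-cost matrix with an extra border row/column of infinities.
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(reference[i - 1] - user[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # reference advances
                                 cost[i, j - 1],      # user advances
                                 cost[i - 1, j - 1])  # both advance
    # Backtrack to recover the frame-to-frame correspondence.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1], cost[n, m]
```

Given such an alignment, a guidance system can compare each user frame against its warped reference frame, so that feedback on position or velocity errors remains meaningful even when the user performs the motion faster or slower than the reference.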
