Limb-O: real-time comparison and visualization of lower limb motions

Since the rise in popularity of video-sharing platforms such as YouTube, learning new skills from the comfort of one's own home has become more accessible than ever. Although such independent learning methods are useful, they lack the real-time feedback of being in the same room with an expert, which is why expensive private coaching sessions remain desirable. Accordingly, we propose Limb-O (orbs for limb movement visualization), a real-time quantitative virtual coaching application for learning lower-limb motions through motion comparison. The application turns practicing motions, such as those found in sports, into a game that highlights imperfections and allows progress to be tracked over time. A user validation study confirmed that Limb-O outperforms traditional video learning methods both quantitatively and qualitatively by providing objective feedback that keeps users engaged.
