Enabling gestural interaction by means of tracking dynamical systems models and assistive feedback

The computational understanding of continuous human movement plays a significant role in diverse emerging applications, in areas ranging from human-computer interaction to physical and neurological rehabilitation. Non-visual feedback can aid the continuous motion-control tasks that such applications frequently entail. We introduce an architecture for interaction with a system that furnishes a number of gestural affordances with assistive feedback. The approach combines machine learning techniques for understanding a user's gestures with a method for displaying salient features of the underlying inference process in real time. The methods used include a particle filter that tracks multiple hypotheses about a user's input as it unfolds, together with models of the nonlinear dynamics intrinsic to the movements of interest. Non-visual feedback in this system is based on the presentation of error features derived from an estimate of the sampled, time-varying probability that the user's gesture corresponds to the state trajectories tracked in the different dynamical systems. We describe applications to interactive systems for human gait analysis and rehabilitation, a domain of considerable current interest in the movement sciences and health care.
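The core inference step described above can be sketched as a particle filter in which each particle carries both a state and a hypothesis about which dynamical model generated the gesture. The following is a minimal, hypothetical illustration, not the paper's implementation: the two candidate models are toy damped oscillators, and all parameter values are assumed for the example. At each time step, the summed particle weights per model yield the time-varying probability that could drive the feedback display.

```python
import numpy as np

# Hypothetical sketch: two candidate gesture models as discrete-time
# linear dynamical systems (a slow and a fast damped oscillator).
def make_oscillator(omega, damping, dt=0.05):
    """Euler-discretized transition matrix for a damped oscillator [pos, vel]."""
    return np.array([[1.0, dt],
                     [-omega**2 * dt, 1.0 - 2.0 * damping * omega * dt]])

MODELS = [make_oscillator(2.0, 0.2), make_oscillator(6.0, 0.2)]

def particle_filter(observations, n_particles=500, obs_noise=0.1,
                    proc_noise=0.05, seed=0):
    """Track which dynamical model best explains a 1-D position signal.

    Returns an array of shape (T, n_models): at each time step, the
    estimated probability that the gesture follows each model.
    """
    rng = np.random.default_rng(seed)
    n_models = len(MODELS)
    # Each particle: a model index and a [position, velocity] state.
    model_idx = rng.integers(0, n_models, n_particles)
    states = np.zeros((n_particles, 2))
    states[:, 0] = observations[0] + rng.normal(0, obs_noise, n_particles)
    probs = []
    for y in observations:
        # Propagate each particle through its hypothesized model's dynamics.
        for m, A in enumerate(MODELS):
            mask = model_idx == m
            states[mask] = states[mask] @ A.T
        states += rng.normal(0, proc_noise, states.shape)
        # Weight by Gaussian likelihood of the observed position.
        w = np.exp(-0.5 * ((y - states[:, 0]) / obs_noise) ** 2) + 1e-300
        w /= w.sum()
        # Per-model posterior probability: summed weights of its particles.
        probs.append([w[model_idx == m].sum() for m in range(n_models)])
        # Resample to keep the particle set focused on likely hypotheses.
        idx = rng.choice(n_particles, n_particles, p=w)
        model_idx, states = model_idx[idx], states[idx].copy()
    return np.array(probs)
```

Feeding this filter a signal generated by one of the models causes the corresponding per-model probability to dominate over time; the error features sonified in the paper would be derived from exactly this kind of evolving probability estimate.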
