A Human-Machine Interface for an Assistive Exoskeleton Based on Face Analysis

This paper proposes a human-machine interface for an assistive exoskeleton based on face analysis. The designed 4-DoF assistive robotic system is dedicated to people suffering from myopathy and aims to compensate for the loss of mobility in the upper limb. The proposed interface converts the user's head gestures and mouth expressions into suitable control commands. In addition, we propose a visual context analysis component that refines these commands. The tests conducted show that a vision-based interface is particularly well suited to disabled users. In this paper, we first describe the problem addressed and the designed mechanical system. Next, we describe the two approaches developed for the visual sensing interface, head control and mouth expression control, with a focus on the mouth extraction algorithm. Finally, we introduce the context detection used for scene understanding.
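To make the head-control idea concrete, the sketch below shows one plausible way to map estimated head-pose angles to discrete actuator commands. This is purely illustrative: the angle thresholds, the dead zone, and the command names are assumptions for this example, not the scheme used in the paper.

```python
def head_pose_to_command(yaw_deg: float, pitch_deg: float,
                         dead_zone: float = 10.0) -> str:
    """Map head-pose angles (degrees) to a discrete control command.

    A dead zone around the neutral pose prevents small, involuntary
    head movements from triggering the exoskeleton. Outside the dead
    zone, the dominant axis (yaw vs. pitch) selects the command.
    All thresholds and command names here are illustrative assumptions.
    """
    if abs(yaw_deg) < dead_zone and abs(pitch_deg) < dead_zone:
        return "STOP"          # neutral pose: no motion
    if abs(yaw_deg) >= abs(pitch_deg):
        # horizontal head turn dominates
        return "LEFT" if yaw_deg > 0 else "RIGHT"
    # vertical head tilt dominates
    return "UP" if pitch_deg > 0 else "DOWN"
```

A command filter of this kind would typically sit between the pose estimator (e.g. a POSIT-style solver as in [9]) and the low-level motor controller, with the context analysis component vetoing commands that conflict with the detected scene.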
