Invited Paper: Multimodal Interface for an Intelligent Wheelchair

As the age distribution of the population continues to shift, politicians and scientists are paying increasing attention to the needs of senior citizens. The well-being and needs of people with disabilities are likewise gaining importance in political and business circles. Intelligent wheelchairs are adapted electric wheelchairs equipped with environmental perception, semi-autonomous behaviour and flexible human-machine interaction. This paper presents the specification and development of a user-friendly multimodal interface, a component of the IntellWheels Platform project. The developed prototype combines several input modules, allowing the wheelchair to be controlled through flexible, user-defined input sequences of distinct types (speech, facial expressions, head movements and joystick). To validate the effectiveness of the prototype, two experiments were performed: in the first, participants drove a simulated wheelchair in a virtual environment; in the second, they drove the real IntellWheels wheelchair prototype. The results show that, owing to the interaction flexibility it provides, the multimodal interface can be used successfully.
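
To make the idea of user-defined multimodal input sequences concrete, the sketch below shows one possible way such a mapping could be structured. It is a minimal illustration under assumed semantics, not the IntellWheels implementation: all names (Modality, InputEvent, SequenceMapper) and the command strings are hypothetical.

```python
# Hypothetical sketch: mapping user-defined sequences of multimodal input
# events to wheelchair commands. Not the IntellWheels code; all names here
# are invented for illustration.
from dataclasses import dataclass
from enum import Enum, auto


class Modality(Enum):
    SPEECH = auto()
    FACIAL_EXPRESSION = auto()
    HEAD_MOVEMENT = auto()
    JOYSTICK = auto()


@dataclass(frozen=True)
class InputEvent:
    modality: Modality
    token: str  # e.g. "go", "blink", "nod", "push_forward"


class SequenceMapper:
    """Matches a growing buffer of input events against user-defined sequences."""

    def __init__(self) -> None:
        self._bindings: dict[tuple[InputEvent, ...], str] = {}
        self._buffer: list[InputEvent] = []

    def bind(self, sequence: list[InputEvent], command: str) -> None:
        # Associate a user-defined event sequence with a wheelchair command.
        self._bindings[tuple(sequence)] = command

    def feed(self, event: InputEvent) -> str | None:
        # Append an event; return a command if a bound sequence completes.
        self._buffer.append(event)
        key = tuple(self._buffer)
        if key in self._bindings:
            self._buffer.clear()
            return self._bindings[key]
        # Reset when no bound sequence can still start with the buffer.
        if not any(b[: len(key)] == key for b in self._bindings):
            self._buffer.clear()
        return None


# Usage: "forward" fires only after the voice word "go" followed by a head nod.
mapper = SequenceMapper()
mapper.bind(
    [InputEvent(Modality.SPEECH, "go"), InputEvent(Modality.HEAD_MOVEMENT, "nod")],
    "forward",
)
mapper.feed(InputEvent(Modality.SPEECH, "go"))                 # returns None
print(mapper.feed(InputEvent(Modality.HEAD_MOVEMENT, "nod")))  # prints "forward"
```

Keying bindings on full event sequences rather than single inputs is one way to let each user compose commands from whichever modalities they can operate reliably, which is the flexibility the abstract describes.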
