IntellWheels MMI: A Flexible Interface for an Intelligent Wheelchair

With the rising concern about the needs of people with physical disabilities and the aging of the population, there is growing interest in creating electronic devices that may improve the lives of physically handicapped and elderly people. One such solution is the adaptation of electric wheelchairs to give them environmental perception, more intelligent capabilities, and more adequate human-machine interaction. This paper focuses on the development of a user-friendly multimodal interface, integrated in the IntellWheels project. The simple multimodal human-robot interface developed allows the connection of several input modules, enabling wheelchair control through flexible input sequences of distinct input types (voice, facial expressions, head movements, keyboard, and joystick). The system is capable of storing user-defined associations between input sequences and the corresponding output commands. The tests performed demonstrated the system's efficiency and the capabilities of this multimodal interface.
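The core mechanism described above, associating user-defined sequences of multimodal inputs with output commands, can be sketched as follows. This is a minimal illustration only, assuming a simple prefix-matching scheme; the class and method names (`CommandMapper`, `bind`, `feed`) and the example bindings are hypothetical and are not the IntellWheels API.

```python
class CommandMapper:
    """Stores user-defined associations between multimodal input
    sequences and wheelchair commands (illustrative sketch)."""

    def __init__(self):
        self._bindings = {}  # tuple of (modality, token) pairs -> command
        self._buffer = []    # input events received so far

    def bind(self, sequence, command):
        """Associate an input sequence (mixing any modalities) with a command."""
        self._bindings[tuple(sequence)] = command

    def feed(self, modality, token):
        """Consume one input event; return a command if a bound sequence
        completes, otherwise None."""
        self._buffer.append((modality, token))
        key = tuple(self._buffer)
        if key in self._bindings:
            self._buffer.clear()
            return self._bindings[key]
        # Restart the buffer if it can no longer be a prefix of any binding.
        if not any(k[:len(key)] == key for k in self._bindings):
            self._buffer = [(modality, token)]
        return None


# Example: one sequence mixes voice and head movement; another uses the joystick.
mapper = CommandMapper()
mapper.bind([("voice", "go"), ("head", "left")], "TURN_LEFT")
mapper.bind([("joystick", "forward")], "MOVE_FORWARD")
```

The point of the sketch is that sequences may freely mix modalities, so a voice command followed by a head movement can trigger the same kind of binding as a single joystick input.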
