TOWARDS THE DREAM OF AN INTELLIGENT, VISUALLY-GUIDED WHEELCHAIR

An important focus of research into smart wheelchairs is assisting people who live with physical disabilities, possibly without speech, but with normal cognitive functioning, vision, and audition. What is sought is an intelligent wheelchair: one that can visually understand the world and do precisely what its operator wants, without tedious and detailed user control. This dream has yet to be achieved [21].

[1] Z. Zenn Bien, et al. Blend of soft computing techniques for effective human-machine interaction in service robotic systems, 2003, Fuzzy Sets Syst.

[2] John K. Tsotsos, et al. Modeling Visual Attention via Selective Tuning, 1995, Artif. Intell.

[3] Yoshiaki Shirai, et al. Look where you're going [robotic wheelchair], 2003, IEEE Robotics Autom. Mag.

[4] Paul Nisbet, et al. Assessment and training of children for powered mobility in the UK, 2002.

[5] G. Fraser Shein. Towards task transparency in alternative computer access: selection of text through switch-based scanning, 1997.

[6] S. P. Levine, et al. The NavChair Assistive Wheelchair Navigation System, 1999, IEEE Transactions on Rehabilitation Engineering.

[7] Yiming Ye, et al. Sensor Planning for 3D Object Search, 1999, Comput. Vis. Image Underst.

[8] John K. Tsotsos, et al. Integration of camera motion behaviours for active object recognition, 1994.

[9] A. Lankenau, et al. Safety in robotics: the Bremen Autonomous Wheelchair, 1998, Proceedings of the 5th International Workshop on Advanced Motion Control (AMC'98), Coimbra.

[10] Sven J. Dickinson, et al. PLAYBOT: A visually-guided robot for physically disabled children, 1998, Image Vis. Comput.

[11] Manuel Mazo, et al. An integral system for assisted mobility [automated wheelchair], 2001, IEEE Robotics Autom. Mag.

[12] John K. Tsotsos. Intelligent control for perceptually attentive agents: The S* proposal, 1997, Robotics Auton. Syst.

[13] Vijay Kumar, et al. Human robot interaction: application to smart wheelchairs, 2002, Proceedings of the 2002 IEEE International Conference on Robotics and Automation.

[14] Karl-Friedrich Kraiss, et al. Supervised navigation and manipulation for impaired wheelchair users, 2004, 2004 IEEE International Conference on Systems, Man and Cybernetics.

[15] John K. Tsotsos, et al. Robot middleware must support task-directed perception, 2007.

[16] Michael J. Black, et al. Mixture models for optical flow computation, 1993, Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition.

[17] Reid G. Simmons, et al. A task description language for robot control, 1998, Proceedings of the 1998 IEEE/RSJ International Conference on Intelligent Robots and Systems.

[18] Donald Lee Pieper. The kinematics of manipulators under computer control, 1968.

[19] Elizabeth S. Helfman, et al. Blissymbolics, speaking without speech, 1980.

[20] Dong-Soo Kwon, et al. Integration of a Rehabilitation Robotic System (KARES II) with Human-Friendly Man-Machine Interaction Units, 2004, Auton. Robots.

[21] R. Simpson. Smart wheelchairs: A literature review, 2005, Journal of Rehabilitation Research and Development.

[22] Michael E. Cleary, et al. The deictically controlled wheelchair, 1998, Image Vis. Comput.

[23] Antonis A. Argyros, et al. Semi-autonomous Navigation of a Robotic Wheelchair, 2002, J. Intell. Robotic Syst.

[24] T. Gomi, et al. The development of an intelligent wheelchair, 1996, Proceedings of the Conference on Intelligent Vehicles.