KARES: Intelligent wheelchair-mounted robotic arm system using vision and force sensor

KARES, an acronym for KAIST Rehabilitation Engineering System, is an intelligent rehabilitation system consisting of a 6-DOF robotic arm mounted on a powered wheelchair. It was developed to assist the disabled and the elderly in carrying out activities independently. Human-machine interaction is a key design issue for intelligent systems such as KARES. Special attention is paid to granting a certain degree of autonomy to the robotic subsystem, since direct control of the robotic arm imposes a high cognitive load on the user, and physically disabled persons may have difficulty operating a joystick or push buttons dexterously enough for delicate movements. To perceive the environment, a color vision sensor and a force/torque sensor are mounted on the end-effector of the robotic arm. To test the system, four basic tasks are defined: picking up a cup from a table, picking up a pen from the floor, moving an object to the user's face, and operating a switch on a wall. These tasks are performed autonomously in a semi-structured environment.
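
The abstract does not spell out the control scheme, but an eye-in-hand camera paired with a wrist force/torque sensor is commonly driven by image-based visual servoing, with a force threshold used to detect contact. The following minimal sketch illustrates that generic approach only; the function names, feature coordinates, depths, gain, and force limit are assumptions made for illustration and are not details taken from the KARES paper.

```python
import numpy as np

# Hypothetical sketch of an eye-in-hand, image-based visual servoing step,
# one plausible way a camera on the end-effector could drive a reach task.
# Nothing here is taken from the KARES paper; all values are illustrative.

def interaction_matrix(x, y, Z):
    """Standard interaction matrix of one normalized image point at depth Z."""
    return np.array([
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ])

def ibvs_camera_twist(features, goal_features, depths, gain=0.5):
    """Camera twist v = -gain * L^+ (s - s*) driving features toward the goal."""
    L = np.vstack([interaction_matrix(x, y, Z)
                   for (x, y), Z in zip(features, depths)])
    error = (np.asarray(features) - np.asarray(goal_features)).ravel()
    return -gain * np.linalg.pinv(L) @ error, np.linalg.norm(error)

if __name__ == "__main__":
    # Toy closed loop: propagate features with s_dot = L v (depths held fixed)
    # and stop early if a simulated wrist force exceeds a contact limit.
    features = [(0.20, 0.10), (-0.15, 0.05), (0.05, -0.20), (-0.10, -0.10)]
    goal = [(0.10, 0.10), (-0.10, 0.10), (0.10, -0.10), (-0.10, -0.10)]
    depths = [0.8, 0.8, 0.8, 0.8]   # metres, assumed constant
    force_limit = 5.0               # newtons, hypothetical contact threshold
    dt = 0.05

    for step in range(200):
        v, err = ibvs_camera_twist(features, goal, depths)
        wrist_force = 0.0           # stand-in for a real force/torque reading
        if wrist_force > force_limit or err < 1e-3:
            break
        features = [tuple(np.asarray(f) + dt * interaction_matrix(f[0], f[1], Z) @ v)
                    for f, Z in zip(features, depths)]
    print(f"stopped after {step + 1} steps, residual feature error {err:.4f}")
```

In this toy loop the image features are propagated with the same interaction matrix that drives the control law, which is only a first-order approximation with depths held fixed; a real system would update the features from the camera and the depths from an estimator.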
