People with upper-limb impairments, including people with multiple sclerosis, spasticity, cerebral palsy, paraplegia, or stroke, depend on a personal assistant in daily life and in the working environment. The robot arm MANUS was designed for such people, but they frequently cannot use it because of their disability: its control elements require a flexibility and coordination in the hand that they do not have. To facilitate access to technological aids, the control functionality of these aids has to be improved. Additionally, it should be possible to enter more abstract commands to simplify the use of the system. This has been partly realized in the commercially available HANDY system. The HANDY user can choose between five fixed actions, such as eating different kinds of food from a tray or being served a drink. However, the restricted flexibility of HANDY is a major disadvantage, as a fixed environment is a prerequisite.

To place the full capacity of technical systems like the robot arm MANUS, with its 6 degrees of freedom (DOF), at the user's disposal, a shared control structure is necessary. Low-level commands that take advantage of all the cognitive abilities of the user provide full control flexibility. To relieve the user, semiautonomous sensor-based actions are also included in the control structure. Actions such as gripping an object in an unstructured environment, or pouring a drink into a cup and serving it to the disabled user, may be started by simple commands. The user remains responsible for decisions in situations where errors are possible, such as locating objects or planning a sequence of preprogrammed actions.

This article presents the robotic system FRIEND, which was developed at the University of Bremen's Institute of Automation (IAT). This system offers increased control functionality for disabled users. FRIEND consists of an electric wheelchair equipped with the robot arm MANUS; both devices are controlled by a computer. The man-machine interface (MMI) consists of a flat screen and a speech interface.
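The shared control structure described above can be sketched as a simple dispatch loop: a high-level command triggers a semiautonomous action, and whenever the action cannot proceed on its own (e.g. an object has not been located), the decision is deferred back to the user. This is a minimal illustration, not the FRIEND implementation; all names (`grip_object`, `ACTIONS`, `execute`) are hypothetical.

```python
from enum import Enum, auto

class Outcome(Enum):
    DONE = auto()
    NEEDS_USER = auto()

# Hypothetical semiautonomous action: in the real system, sensor-based
# control (e.g. visual servoing) would locate and grip the object; here
# we model only the shared-control flow.
def grip_object(scene):
    if scene.get("object_located"):
        return Outcome.DONE
    return Outcome.NEEDS_USER  # user must locate the object first

ACTIONS = {"grip": grip_object}

def execute(command, scene, ask_user):
    """Run a semiautonomous action, deferring to the user whenever
    the action cannot decide on its own."""
    action = ACTIONS[command]
    result = action(scene)
    if result is Outcome.NEEDS_USER:
        # e.g. the user marks the object on the flat screen
        scene.update(ask_user(command))
        result = action(scene)
    return result
```

The key design point is that autonomy is per-step rather than all-or-nothing: the system attempts the sensor-based action first and involves the user only at the decision points where errors are possible.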
FRIEND's hardware and software are described first. The current state of development is then presented, as well as research results that will be integrated soon. After a short explanation of the speech interface, the methods developed for semiautonomous control are described: programming by demonstration, visual servoing, and configuration planning based on the method of imaginary links. We also describe the state of integration and our experience to date.