A FRIEND for Assisting Handicapped People
The Semiautonomous Robotic System "FRIEND" Consists of an Electric Wheelchair with a Robotic Arm and Utilizes a Speech Interface

People with upper-limb impairments, including people with multiple sclerosis, spasticity, cerebral palsy, paraplegia, or stroke, depend on a personal assistant in daily life situations and in the working environment. The robot arm MANUS was designed for such people, but they frequently cannot use it because of their disability: the control elements require a flexibility and coordination in the hand that they do not have. To facilitate access to technological aids, the control functionality of these aids has to be improved. Additionally, it should be possible to enter more abstract commands to simplify the use of the system. This has been partly realized in the commercially available HANDY system. The HANDY user can choose between five fixed actions, such as eating different kinds of food from a tray or being served a drink. However, the restricted flexibility of HANDY is a big disadvantage, as a fixed environment is a prerequisite.

To place the full capacity of technical systems like the robot arm MANUS, with its 6 degrees of freedom (DOF), at the user's disposal, a shared control structure is necessary. Low-level commands, which take advantage of all of the user's cognitive abilities, provide full control flexibility. To relieve the user, semiautonomous sensor-based actions are also included in the control structure. Actions such as gripping an object in an unstructured environment or pouring a drink into a cup and serving it to the user may be started by simple commands. The user is still responsible for decisions in situations where errors are possible, such as locating objects or planning a sequence of preprogrammed actions. The sketch at the end of this section illustrates this split between low-level and semiautonomous commands.

This article presents the robotic system FRIEND, which was developed at the University of Bremen's Institute of Automation (IAT). The system offers increased control functionality for disabled users. FRIEND consists of an electric wheelchair equipped with the robot arm MANUS; both devices are controlled by a computer. The man-machine interface (MMI) consists of a flat screen and a speech interface. FRIEND's hardware and software are described first. The current state of development is then presented, as well as research results that will be integrated soon. After a short explanation of the speech interface, the methods developed for semiautonomous control are described. These are programming by demonstration, visual servoing, and configuration planning based on the method of imaginary links. We also describe the state of integration and our experience to date.
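To make the shared-control idea above concrete, the following minimal Python sketch shows how recognized speech commands might be routed either to low-level arm motions, which leave every movement decision to the user, or to semiautonomous, sensor-based actions that bundle many motions behind a single command. All class, function, and command names (ManusArm, grip_object, pour_and_serve, and the command phrases) are illustrative assumptions and are not taken from the actual FRIEND software.

```python
# Hypothetical sketch of a shared-control command dispatcher.
# Names and commands are illustrative only, not the FRIEND implementation.

class ManusArm:
    """Placeholder interface to the 6-DOF MANUS arm."""

    def move_joint(self, joint: int, velocity: float) -> None:
        print(f"joint {joint} -> velocity {velocity:+.2f} rad/s")

    def open_gripper(self) -> None:
        print("gripper opened")


def grip_object(arm: ManusArm, object_name: str) -> None:
    """Semiautonomous, sensor-based action: locate and grip an object."""
    print(f"locating '{object_name}' and approaching it under visual control")


def pour_and_serve(arm: ManusArm) -> None:
    """Semiautonomous action: pour a drink into a cup and serve it to the user."""
    print("pouring drink and serving it to the user")


# Low-level commands keep the user in full control of every motion;
# semiautonomous actions hide whole motion sequences behind one spoken command.
LOW_LEVEL = {
    "arm up": lambda arm: arm.move_joint(2, +0.1),
    "arm down": lambda arm: arm.move_joint(2, -0.1),
    "open hand": lambda arm: arm.open_gripper(),
}

SEMI_AUTONOMOUS = {
    "grip the bottle": lambda arm: grip_object(arm, "bottle"),
    "serve a drink": lambda arm: pour_and_serve(arm),
}


def dispatch(spoken_command: str, arm: ManusArm) -> None:
    """Route a recognized speech command to the appropriate control level."""
    command = spoken_command.strip().lower()
    if command in LOW_LEVEL:
        LOW_LEVEL[command](arm)
    elif command in SEMI_AUTONOMOUS:
        SEMI_AUTONOMOUS[command](arm)
    else:
        # Unknown utterance: leave the decision to the user rather than guessing.
        print(f"unknown command: '{spoken_command}'")


if __name__ == "__main__":
    arm = ManusArm()
    for utterance in ["arm up", "grip the bottle", "serve a drink"]:
        dispatch(utterance, arm)
```

The design point of such a dispatcher is that the user always retains the low-level vocabulary as a fallback, while the semiautonomous vocabulary reduces the number of commands needed for routine tasks.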