KANTRA: Human-machine interaction for intelligent robots using natural language

In this paper, a new natural language interface is presented that makes the use of intelligent robots more flexible. The interface was developed for the autonomous mobile two-arm robot KAMRO, which uses several camera systems to generate an environment model and to perform assembly tasks. A fundamental requirement in human-machine interaction for intelligent robots is the ability to refer to objects in the robot's environment. Hence, the interface and the intelligent system need similar environment models, and current sensor information must be available to both. Additional flexibility can be achieved by integrating the man-machine interface into the control architecture of the robot and giving it access to all internal information and to the models that the robot uses for autonomous behaviour. In order to fully exploit the capabilities of natural language access, we present the dialogue-based interface KANTRA, in which human-machine interaction is not restricted to unidirectional communication.
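To make the notion of object reference concrete, the following minimal sketch illustrates how a dialogue-based interface might resolve a noun phrase such as "the red cube" against a shared environment model, asking back rather than guessing when a reference is ambiguous. The class, field names, and dialogue behaviour are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical environment model shared by the robot and the NL interface.
# Names, attributes, and poses are illustrative, not taken from the paper.
@dataclass
class WorldObject:
    name: str        # symbolic identifier, e.g. "cube_1"
    category: str    # object type used during reference resolution
    color: str
    position: tuple  # (x, y) in the robot's workspace frame

def resolve_reference(category, color, world):
    """Resolve a noun phrase like 'the red cube' against the current
    environment model; ambiguity triggers a clarification question."""
    matches = [o for o in world if o.category == category and o.color == color]
    if not matches:
        return None, "I cannot find such an object."
    if len(matches) > 1:
        # Bidirectional communication: instead of picking arbitrarily,
        # the dialogue-based interface asks the user to disambiguate.
        return None, f"I see {len(matches)} {color} {category}s. Which one?"
    return matches[0], f"OK, using {matches[0].name}."

world = [
    WorldObject("cube_1", "cube", "red",  (0.30, 0.10)),
    WorldObject("cube_2", "cube", "blue", (0.45, 0.20)),
]
obj, answer = resolve_reference("cube", "red", world)
print(answer)  # -> "OK, using cube_1."
```

The key design point the sketch captures is that the interface queries the same environment model the robot plans with, so references stay consistent with current sensor information, and that ambiguity is handled by dialogue rather than by a one-way command channel.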
