Development of Mobile Robot with Preliminary-announcement and Display Function of Forthcoming Motion using Projection Equipment

This paper presents the mobile robot PMR-5, which is equipped with a preliminary-announcement and display function that indicates its forthcoming operations to nearby people using a projector. The projector is mounted on the mobile robot and projects a two-dimensional frame onto the running surface. Within this frame, both the scheduled course and the state of operation can be announced clearly as movement information. Using the developed robot, we examine the presentation of operating states such as stopping and backing up, together with timing information for the scheduled course. The scheduled course is expressed as arrows, chosen for their intelligibility at a glance: an arrow directly expresses the direction of motion, and its length announces the speed. The motion up to three seconds ahead is indicated by three connected arrows, color-coded by second, so that changes in speed over the three-second period are visible. A sign for spot rotation and text indications for stopping and backing up are also displayed. We exhibited the robot, and about 200 visitors completed a questionnaire evaluation. On a five-point scale, the average scores were 4.5 for the direction of motion and 3.9 for the speed of motion, indicating that the display is generally intelligible.
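The arrow-based preview described above can be sketched in code. The following is a minimal, hypothetical illustration (not the paper's implementation): it assumes a planned velocity profile sampled once per second, encodes speed as arrow length via an assumed scale factor, and chains the three one-second arrows tip-to-tail so a change of speed appears as a change of segment length. The function and color names are illustrative assumptions; only the three-second horizon, the length-encodes-speed rule, and the color-per-second coding come from the paper.

```python
# Hypothetical sketch of the projected arrow preview; names, palette, and
# the SCALE factor are assumptions, not the authors' implementation.
import math

HORIZON_S = 3                          # the paper previews 3 seconds of motion
COLORS = ["red", "yellow", "green"]    # one color per upcoming second (assumed palette)
SCALE = 0.5                            # projected meters of arrow per (m/s) -- assumed

def preview_arrows(plan):
    """plan: list of (speed_mps, heading_rad), one entry per upcoming second.
    Returns connected arrow segments (start_xy, end_xy, color) on the floor plane."""
    x, y = 0.0, 0.0                    # robot's current position in the projected frame
    segments = []
    for i, (speed, heading) in enumerate(plan[:HORIZON_S]):
        length = speed * SCALE         # arrow length encodes speed
        end = (x + length * math.cos(heading),
               y + length * math.sin(heading))
        segments.append(((x, y), end, COLORS[i]))
        x, y = end                     # arrows are chained tip-to-tail
    return segments

# A robot accelerating while gradually turning left over the next 3 seconds:
for seg in preview_arrows([(0.4, 0.0), (0.6, 0.2), (0.8, 0.4)]):
    print(seg)
```

Because the segments are chained, a constant speed yields three equal-length arrows, while acceleration yields progressively longer ones, which is what lets observers read the speed change at a glance.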
