Development of Four Kinds of Mobile Robot with Preliminary-Announcement and Indication Function of Upcoming Operation

We propose approaches and equipment for preliminarily announcing and indicating to people the speed and direction of motion of mobile robots moving on a two-dimensional plane. The four approaches fall into two categories: (1) announcing the state immediately following the present and (2) continuously indicating operations from the present up to some future time. To realize these approaches, the announcement units of the prototype robots use an omni-directional display (PMR-2), a flat-panel display (PMR-6), a laser pointer (PMR-1), and projection equipment (PMR-5). The four prototypes were exhibited at the 2005 International Robot Exhibition (iREX05), where visitors answered questionnaires on a five-point scale. The projector robot PMR-5 received the highest evaluation score of the four. An examination of differences by gender and age suggested that some people prefer simple information, friendly expressions, and a minimum of information presented at one time.
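The second category, continuously indicating operations from the present up to some future time, amounts to predicting the robot's near-future path and rendering it on the floor (for example by the projection equipment of PMR-5 or the laser pointer of PMR-1). The sketch below shows one minimal way such a prediction step could be computed, assuming a differential-drive (unicycle) motion model with constant commanded speed and turn rate over a short preview horizon; the function, class, and parameter names are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch (assumption, not the paper's implementation): predict the poses a
# differential-drive robot will pass through over a short preview horizon, so that
# an announcement unit could draw the upcoming path on the floor.

import math
from dataclasses import dataclass


@dataclass
class Pose:
    x: float      # position on the floor plane [m]
    y: float      # position on the floor plane [m]
    theta: float  # heading [rad]


def preview_path(pose: Pose, v: float, omega: float,
                 horizon: float = 2.0, dt: float = 0.1) -> list[Pose]:
    """Integrate a unicycle model from the current pose to `horizon` seconds ahead,
    assuming the commanded linear speed v [m/s] and turn rate omega [rad/s] stay
    constant. The returned poses can be rendered as the announced upcoming motion."""
    path = []
    x, y, theta = pose.x, pose.y, pose.theta
    t = 0.0
    while t <= horizon:
        path.append(Pose(x, y, theta))
        x += v * math.cos(theta) * dt
        y += v * math.sin(theta) * dt
        theta += omega * dt
        t += dt
    return path


if __name__ == "__main__":
    # Robot at the origin, moving forward at 0.5 m/s while turning slowly to the left.
    for p in preview_path(Pose(0.0, 0.0, 0.0), v=0.5, omega=0.3):
        print(f"x={p.x:.2f} m  y={p.y:.2f} m  heading={math.degrees(p.theta):.1f} deg")
```

Under this sketch, the first category (announcing only the state just after the present) would correspond to showing just the first predicted pose or velocity, while the second category draws the whole preview path.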
