Overlay what Humanoid Robot Perceives and Thinks to the Real-world by Mixed Reality System

One of the difficulties in developing a humanoid robot is that intermediate results, such as how the robot perceives its environment and how it plans its motion path, are hard to observe online in the physical environment. Developers can see only the robot's behavior, so they usually analyze logged data afterwards to determine how well each component worked, or which component failed in the overall system. In this paper, we present a novel environment for robot development in which intermediate results of the system are overlaid on physical space using mixed reality technology. Real-time observation lets developers see intuitively in what situation specific intermediate results are generated, and understand how the output of one component affected the overall system. This makes development more efficient and precise. The environment also provides a human-robot interface that presents the robot's internal state intuitively, not only during development but also in operation.
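One way to realize such an overlay pipeline is to have each robot component publish its intermediate results as timestamped, world-coordinate drawing primitives that the mixed reality viewer renders in place. The following is a minimal sketch under that assumption; the names (`OverlayPrimitive`, `encode`, `decode`) are hypothetical and not taken from the paper's actual implementation:

```python
import json
import time
from dataclasses import dataclass, field, asdict

@dataclass
class OverlayPrimitive:
    """One intermediate result, expressed as something the MR viewer can draw."""
    component: str           # which module produced this result
    kind: str                # e.g. "point_cloud", "path", "label"
    points: list             # world-frame coordinates [[x, y, z], ...]
    color: list = field(default_factory=lambda: [0, 255, 0])
    stamp: float = field(default_factory=time.time)

def encode(primitives):
    """Serialize primitives to JSON for transmission to the MR viewer."""
    return json.dumps([asdict(p) for p in primitives])

def decode(msg):
    """Reconstruct primitives on the viewer side for rendering."""
    return [OverlayPrimitive(**d) for d in json.loads(msg)]

# Example: the planner publishes its candidate walking path while the vision
# module publishes a detected obstacle, so a developer sees both results
# overlaid on the physical scene in real time instead of in post-hoc logs.
path = OverlayPrimitive("planner", "path",
                        [[0.0, 0.0, 0.0], [0.5, 0.1, 0.0], [1.0, 0.0, 0.0]])
obstacle = OverlayPrimitive("vision", "point_cloud",
                            [[0.7, 0.3, 0.0]], color=[255, 0, 0])
msg = encode([path, obstacle])
```

Tagging each primitive with its producing component is what lets a developer attribute an on-scene artifact (a misplaced path, a phantom obstacle) to the specific module that generated it.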
