Development of an Air Hockey Robot Improving with Human Players

This paper presents a developed Air Hockey Robot system comprising a camera unit, a processing unit, and an arm unit. The system (1) detects the position of the air hockey puck in images captured by a ceiling camera, (2) generates a trajectory for a 2-axis robot arm, and (3) controls the arm so as to shoot the puck toward an opposing human player. The processing unit therefore executes three processes: image processing, trajectory generation, and arm control. In this paper, we focus on the trajectory generation process in order to improve the robot's offensive performance. Specifically, we give the robot a skill that attacks a weak point of the human visual mechanism through the arm trajectory it generates when shooting a stationary puck. Because humans can observe an object in detail only within their central field of view, corresponding to the fovea, despite their quite wide overall field of view, a human player must switch gaze points between the robot hand and the moving puck. We therefore plan the arm trajectory and control the arm so that the human player more easily makes a mistake, caused by the time delay that occurs when switching gaze points. Verification experiments, in which the puck is shot at different speeds from the same arm trajectory so that the human player's mistake is easily induced, have been conducted using the developed Air Hockey Robot system.
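The experiment described above replays one Cartesian hand path while varying only the puck's shot speed. The paper does not give its trajectory equations, so the following is a minimal sketch under assumed conditions: a hypothetical 2-link planar arm with made-up link lengths, elbow-down inverse kinematics, and a straight-line hand path to a stationary puck sampled at a chosen speed. All names (`inverse_kinematics`, `shot_trajectory`, `L1`, `L2`) are illustrative, not the authors' implementation.

```python
import math

L1, L2 = 0.25, 0.25  # hypothetical link lengths [m]

def inverse_kinematics(x, y, l1=L1, l2=L2):
    """Elbow-down IK for a 2-link planar arm; returns (theta1, theta2) in radians."""
    c2 = (x * x + y * y - l1 * l1 - l2 * l2) / (2.0 * l1 * l2)
    c2 = max(-1.0, min(1.0, c2))  # clamp against floating-point drift
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

def forward_kinematics(theta1, theta2, l1=L1, l2=L2):
    """Hand position of the 2-link arm, used here only to verify the IK."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def shot_trajectory(start, puck, speed, dt=0.01):
    """Straight-line hand path from `start` to a stationary puck, sampled at
    a constant hand speed [m/s] and converted to joint-angle waypoints.
    Replaying the same path with a different `speed` changes only the timing,
    mirroring the experiment of varying puck speed from one arm trajectory."""
    dx, dy = puck[0] - start[0], puck[1] - start[1]
    dist = math.hypot(dx, dy)
    n = max(1, int(dist / (speed * dt)))  # number of control steps
    return [inverse_kinematics(start[0] + i / n * dx, start[1] + i / n * dy)
            for i in range(n + 1)]
```

In this sketch, `shot_trajectory((0.1, 0.1), (0.4, 0.3), speed=2.0)` yields roughly half as many waypoints as the same call with `speed=1.0`, so the identical Cartesian path is executed in half the time.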
