Mobile robot teleoperation through eye-gaze (TeleGaze)

In most teleoperation applications, the human operator must monitor the status of the robot as well as issue control commands for the whole duration of the operation. With a vision-based feedback system, monitoring the robot requires the operator to watch a continuous stream of images displayed on an interaction screen. The operator's eyes are therefore fully engaged in monitoring, and the hands in controlling. Since the eyes are engaged in monitoring anyway, inputs from the operator's gaze can be used to aid controlling. This frees the operator's hands, either partially or fully, from controlling, so that they can perform other necessary tasks. The challenge, however, lies in distinguishing the gaze inputs intended for controlling from those that merely serve monitoring. In mobile robot teleoperation, controlling consists mainly of issuing locomotion commands to drive the robot, while monitoring consists of watching where the robot goes and looking out for obstacles along the route. Interestingly, there exists a strong correlation between humans' gazing behaviour and their movement intentions. This correlation is exploited in this thesis to investigate novel means of mobile robot teleoperation through eye-gaze, named TeleGaze for short.
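One common way to separate controlling glances from monitoring glances is to overlay command regions on the video feed and require a sustained dwell before a region fires. The following is a minimal sketch of that idea, not the actual TeleGaze interface design: the normalized gaze coordinates, the region boundaries, and the 500 ms dwell threshold are all illustrative assumptions.

```python
import time

# Illustrative command regions in normalized screen coordinates [0, 1].
# Gaze falling outside every region (the centre of the feed) is treated
# as monitoring. Boundaries are assumptions, not the TeleGaze layout.
REGIONS = {
    "forward": lambda x, y: y < 0.2,
    "left":    lambda x, y: x < 0.2,
    "right":   lambda x, y: x > 0.8,
    "reverse": lambda x, y: y > 0.8,
}

DWELL_THRESHOLD = 0.5  # seconds of sustained gaze before a command fires (assumed)


class DwellClassifier:
    """Distinguishes deliberate controlling glances from monitoring
    glances by how long the gaze dwells inside a command region."""

    def __init__(self, threshold=DWELL_THRESHOLD):
        self.threshold = threshold
        self.current = None   # region the gaze is currently inside, if any
        self.entered = 0.0    # time the gaze entered that region

    def update(self, x, y, now=None):
        """Feed one gaze sample; return a locomotion command name or None."""
        now = time.monotonic() if now is None else now
        region = next(
            (name for name, inside in REGIONS.items() if inside(x, y)), None
        )
        if region != self.current:
            # Gaze moved to a new region (or back to the centre):
            # restart the dwell timer and treat the sample as monitoring.
            self.current, self.entered = region, now
            return None
        if region and now - self.entered >= self.threshold:
            # Sustained dwell inside a command region: a deliberate command.
            # Re-arm the timer so the command repeats at the dwell rate.
            self.entered = now
            return region
        return None  # still dwelling, or monitoring the centre of the feed


# Usage sketch: poll the eye tracker and forward fired commands to the robot.
classifier = DwellClassifier()
for x, y in [(0.5, 0.5), (0.1, 0.5), (0.1, 0.5)]:  # stand-in gaze samples
    command = classifier.update(x, y)
    if command:
        print(f"send locomotion command: {command}")
```

A dwell threshold like this is one standard answer to the so-called Midas touch problem in gaze interaction: without it, every glance at the scene would be misread as a command.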
