Mobile Robot Navigation using Gaze Contingent Dynamic Interface

Using the eyes as an input modality in different control environments is of great interest, both for increasing the bandwidth of human-machine interaction and for providing an interaction channel when the hands cannot be used. Interface design requirements in such applications differ considerably from those in conventional application areas, because the eyes may have to perform command-execution and feedback-observation tasks simultaneously. To control the motion of a mobile robot through operator gaze, gaze-contingent regions in the operator interface execute robot movement commands, with different screen areas mapped to specific directions. Dwell time is one of the most established techniques for performing an eye-based click analogous to a mouse click, but incurring the dwell time repeatedly while switching between gaze-contingent regions and feedback regions degrades the performance of the application. We have developed a dynamic gaze-contingent interface that merges gaze-contingent regions with feedback regions dynamically. This technique has two advantages: first, it improves the overall performance of the system by eliminating repeated dwell time; second, it reduces operator fatigue by providing a larger area to fixate on. The operator can monitor feedback more easily while sending commands at the same time.
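
The dwell-based region selection described above can be sketched compactly. The Python below is a minimal illustration, not the authors' implementation: the region layout, the 0.8 s threshold, and all names (Region, REGIONS, dwell_select, the command strings) are assumptions made for the example.

from dataclasses import dataclass

DWELL_TIME_S = 0.8  # assumed dwell threshold; values around 0.5-1.0 s are typical

@dataclass
class Region:
    """A gaze-contingent screen area mapped to one robot motion command."""
    name: str  # command this region triggers, e.g. "forward" (hypothetical)
    x: int     # top-left corner, screen pixels
    y: int
    w: int     # width, screen pixels
    h: int     # height, screen pixels

    def contains(self, gx, gy):
        return self.x <= gx < self.x + self.w and self.y <= gy < self.y + self.h

# Illustrative layout only: different screen areas control specific directions.
REGIONS = [
    Region("forward", 200, 0, 240, 120),
    Region("left", 0, 120, 120, 240),
    Region("right", 520, 120, 120, 240),
    Region("stop", 200, 360, 240, 120),
]

def dwell_select(gaze_samples):
    """Yield a command whenever gaze stays in one region for DWELL_TIME_S.

    gaze_samples is an iterable of (timestamp_s, x, y) tuples from the tracker.
    """
    current, entered = None, 0.0
    for t, gx, gy in gaze_samples:
        hit = next((r for r in REGIONS if r.contains(gx, gy)), None)
        if hit is not current:           # gaze moved to a different region
            current, entered = hit, t    # restart the dwell timer
        elif hit is not None and t - entered >= DWELL_TIME_S:
            yield hit.name               # dwell completed: emit the command
            entered = t                  # re-arm so the command can repeat

# Example: gaze held inside the "forward" region for 1.0 s emits one command.
samples = [(0.0, 300, 50), (0.5, 305, 55), (1.0, 310, 60)]
for cmd in dwell_select(samples):
    print(cmd)  # -> forward

In the dynamic interface proposed here, the feedback view shares its screen area with the command regions, so the operator does not incur a fresh dwell after every glance at the feedback; the sketch covers only the basic dwell-click mechanism.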
