Gaze-controlled driving

We investigate whether gaze (the point of regard) can be used to control a remote vehicle driving on a racing track. Five input devices (on-screen buttons, mouse pointing, a low-cost webcam eye tracker, and two commercial eye tracking systems) provide heading and speed control over the scene view transmitted from the moving robot. Gaze control was found to perform similarly to mouse control. This suggests that robots and wheelchairs may be controlled "hands-free" through gaze. Low-precision gaze tracking and image transmission delays had a noticeable effect on performance.
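To make the control scheme concrete, the sketch below shows one plausible way a gaze point on the transmitted scene view could be mapped to heading and speed commands. It is not the authors' implementation; the coordinate convention, dead-zone size, and command ranges are assumptions chosen for illustration.

```python
# A minimal sketch (assumed, not the paper's implementation) of mapping a
# gaze point on the scene view to heading and speed commands.

def gaze_to_drive_command(gaze_x: float, gaze_y: float,
                          frame_width: int, frame_height: int,
                          dead_zone: float = 0.1):
    """Convert a gaze point (in pixels) on the video frame into
    (steering, speed), each in the range [-1.0, 1.0]."""
    # Normalize the gaze point to [-1, 1] relative to the frame centre.
    x = (gaze_x / frame_width) * 2.0 - 1.0
    y = (gaze_y / frame_height) * 2.0 - 1.0

    # Horizontal offset from centre steers the vehicle; a small dead zone
    # suppresses jitter from low-precision gaze tracking.
    steering = 0.0 if abs(x) < dead_zone else max(-1.0, min(1.0, x))

    # Looking above the frame centre drives forward, below drives backward
    # (screen y grows downward, hence the sign flip).
    speed = 0.0 if abs(y) < dead_zone else max(-1.0, min(1.0, -y))

    return steering, speed


if __name__ == "__main__":
    # Example: gaze slightly right of centre, in the upper half of a
    # 640x480 frame -> turn right while moving forward.
    print(gaze_to_drive_command(420.0, 150.0, 640, 480))
```

The dead zone around the frame centre illustrates one simple way to cope with the low-precision gaze tracking noted in the abstract; how the original systems actually smoothed or thresholded gaze input is not specified here.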
