Implicit human intention inference through gaze cues for people with limited motion ability

Assistive technologies hold the promise of enabling independent daily living for elderly and disabled people. However, most modern human-machine communication methods are not accessible to people with severely limited motion ability, who cannot use them to effectively express their service requests. In this paper, we present a novel interaction framework that facilitates communication between a human and assistive devices. In this framework, human intention is inferred implicitly by monitoring gaze movements. The advantages of this approach are that gaze-based communication requires very little effort from the user, and that most elderly and disabled people with motor impairments retain their visual capability. We introduce the architecture of the proposed framework and validate its effectiveness, and we further discuss the relationship between human intentions and gaze behaviors. This work is expected to simplify human-machine interaction, thereby promoting the adoption of assistive technologies and enhancing the user's independence in daily living.
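To make the core idea concrete, the sketch below shows one plausible way such implicit inference could be realized: gaze-movement features extracted over an observation window are fed to a multiclass SVM that maps them to a service request. This is a minimal illustration only, not the paper's implementation; the feature set, intention labels, classifier choice, and confidence threshold are all assumptions made for the example.

```python
# Illustrative sketch (assumed design, not the paper's method): infer a
# user's intention from gaze-fixation features with a multiclass SVM.
import numpy as np
from sklearn.svm import SVC

# Hypothetical gaze features per observation window:
# [mean fixation duration (s), fixation count, dwell ratio on the gazed
#  object, saccade rate (1/s)] -- all feature choices are assumptions.
X_train = np.array([
    [0.90,  6, 0.70, 1.2],  # long dwell on a cup        -> "drink"
    [0.85,  5, 0.65, 1.4],
    [0.30, 12, 0.15, 3.1],  # scattered scanning          -> "none"
    [0.25, 14, 0.10, 3.4],
    [0.60,  8, 0.55, 2.0],  # repeated glances at a door  -> "move"
    [0.55,  9, 0.50, 2.2],
])
y_train = ["drink", "drink", "none", "none", "move", "move"]

# SVC handles the multiclass case internally (one-vs-one decomposition).
clf = SVC(kernel="rbf", probability=True)
clf.fit(X_train, y_train)

# Classify a new gaze-observation window.
window = np.array([[0.80, 7, 0.62, 1.5]])
probs = clf.predict_proba(window)[0]
intent = clf.classes_[np.argmax(probs)]

# Act only on confident inferences, so that idle visual scanning does not
# trigger unwanted device actions (threshold value is an assumption).
if probs.max() > 0.7:
    print(f"Inferred intention: {intent} (p = {probs.max():.2f})")
```

The confidence gate at the end reflects a key design concern for implicit interfaces: because the user never issues an explicit command, the system must distinguish intentional gaze behavior from ordinary looking around before acting.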
