Natural Gaze Data-Driven Wheelchair

Natural eye movements during navigation have long been considered to reflect planning processes and to signal the user’s future action intentions. Here we investigate whether natural eye movements during joystick-based navigation of wheelchairs follow identifiable patterns that are predictive of joystick actions. To place eye movements in the context of driving intentions, we combine eye tracking with a 3D depth camera system, which allows us to identify eye movements whose gaze target is the floor and to distinguish them from other, non-navigation-related eye movements. Across all subjects, we find consistent patterns of floor-directed eye movements that are predictive of the steering commands issued by the driver. Based on these empirical data we developed two gaze decoders using supervised machine learning, enabling each driver to steer the wheelchair with their eyes alone by imagining joystick use (motor imagery) so as to evoke the corresponding natural eye movements. We show that all subjects are able to navigate their wheelchair “by eye”, learning to do so within minutes. Our work shows that simple gaze-based decoding, without the need for artificial user interfaces, suffices to restore mobility and increase participation in daily life.
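
To make the decoding pipeline concrete, the following is a minimal sketch, not the authors’ implementation: it intersects a 3D gaze ray with a floor plane (as would be recovered from the depth camera) and maps the resulting floor-gaze point to a joystick-like steering command with a generic supervised classifier. All variable names, coordinates, and the choice of logistic regression are illustrative assumptions.

```python
# Sketch (assumed, not the paper's method): floor-gaze intersection plus a
# supervised decoder from floor-gaze points to steering commands.
import numpy as np
from sklearn.linear_model import LogisticRegression

def gaze_floor_intersection(origin, direction, plane_normal, plane_point):
    """Return the 3D point where a gaze ray hits the floor plane,
    or None if the ray is parallel to or points away from the floor."""
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-6:
        return None  # gaze ray runs parallel to the floor
    t = np.dot(plane_normal, plane_point - origin) / denom
    return origin + t * direction if t > 0 else None

# Hypothetical training data: (x, z) floor-gaze coordinates in the
# wheelchair frame, labelled with the joystick command issued at that time.
X_train = np.array([[0.0, 1.5], [-0.6, 1.2], [0.7, 1.3], [0.0, 0.8]])
y_train = np.array(["forward", "left", "right", "stop"])
decoder = LogisticRegression().fit(X_train, y_train)

# At run time: eye-in-world ray from the eye tracker, floor plane from the
# depth camera; decode a steering command from the intersection point.
hit = gaze_floor_intersection(
    origin=np.array([0.0, 1.1, 0.0]),        # assumed eye position (m)
    direction=np.array([0.1, -0.5, 1.0]),    # assumed gaze direction
    plane_normal=np.array([0.0, 1.0, 0.0]),  # floor normal from depth data
    plane_point=np.array([0.0, 0.0, 0.0]),   # a point on the floor
)
if hit is not None:
    command = decoder.predict(hit[[0, 2]].reshape(1, -1))[0]
    print("decoded steering command:", command)
```

In practice the decoder would be trained on each driver’s own joystick sessions, as the abstract describes, rather than on the toy labels used here.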
