Eye-movement-based electronic wheelchair for physically challenged persons

A powered wheelchair is a mobility aid for persons with moderate to severe physical disabilities or chronic diseases, as well as for the elderly. To accommodate different disabilities, various interfaces have been developed for powered-wheelchair control, such as joystick, head, and sip-and-puff control. Many people with disabilities cannot operate a powered wheelchair through these interfaces; the proposed model offers a possible alternative. In this paper, we use an optical eye-tracking system to control a powered wheelchair. The user's eye movements are translated to screen positions by the optical eye tracker. When the user looks at an appropriate angle, the computer input system sends a command to the software based on the angle of rotation of the pupil: looking up moves the wheelchair forward, looking left turns it left, looking right turns it right, and in all other cases the wheelchair stops. Once the image has been processed, the result is passed to the second part of the system, the microprocessor. The microprocessor takes a USB output from the laptop and converts it into the signals that drive the wheelchair's wheels. Pressure and object-detection sensors are also connected to the microprocessor to provide the feedback needed for safe operation of the wheelchair system. The final part of the project is the wheelchair itself: the rear wheels provide forward motion, and the two front wheels steer left and right. All four wheels are connected to the microprocessor, which sends the signals that control the wheels and thus the overall movement.
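The gaze-to-command mapping described above can be sketched as a simple threshold classifier. The function name, the normalized pupil offsets, and the threshold value below are illustrative assumptions, not details from the original system:

```python
def gaze_to_command(dx: float, dy: float, threshold: float = 0.3) -> str:
    """Map a normalized pupil offset (dx, dy) in [-1, 1] to a drive command.

    Assumed screen-style coordinates: dx < 0 means the pupil moved left,
    dy < 0 means it moved up. Offsets within the dead zone defined by
    `threshold`, and downward gaze, produce no command, so the chair
    stops by default (a fail-safe choice).
    """
    if dy < -threshold:   # looking up -> move forward
        return "FORWARD"
    if dx < -threshold:   # looking left -> turn left
        return "LEFT"
    if dx > threshold:    # looking right -> turn right
        return "RIGHT"
    return "STOP"         # all other cases -> stop
```

When up and sideways gaze exceed the threshold at the same time, the ordering above gives forward motion priority; a real controller would likely also debounce the command over several frames before driving the wheels.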
