Object Acquisition and Selection in Human Computer Interaction Systems: A Review

Object acquisition and selection are two essential functions performed in most human-computer interaction (HCI) systems. Researchers have devised various techniques to carry out these operations, and choosing an appropriate combination of acquisition and selection techniques for a particular HCI system has become a research issue in its own right, especially when the users of these systems are differently abled. This paper presents a review of object acquisition and selection techniques used in HCI systems. It begins with an introduction to HCI systems, gives an overview of object acquisition and selection techniques, feedback modes, performing mouse-analogous actions using eye blinks, and the applications of HCI systems, and finally discusses challenges and open issues related to these techniques.
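To illustrate the kind of mechanism the review covers, the sketch below shows one common way blink-based systems map eye blinks to mouse-analogous selection events: an upstream video-based detector reports per-frame eye openness (e.g., an eye aspect ratio), and closures are classified by duration so that involuntary blinks are ignored while longer, deliberate closures trigger clicks. All names, thresholds, and the duration-to-click mapping here are illustrative assumptions, not the method of any specific cited system.

```python
# Minimal sketch (hypothetical names and thresholds): mapping eye-blink
# duration to mouse-analogous selection events. Assumes an upstream
# detector supplies per-frame eye openness (e.g., eye aspect ratio, EAR)
# at a known frame rate; short involuntary blinks are filtered out.

from dataclasses import dataclass
from typing import Iterable, List, Optional


@dataclass
class ClickEvent:
    kind: str         # "left_click" or "right_click"
    frame_index: int  # frame at which the eye closure ended


def blinks_to_clicks(
    ear_stream: Iterable[float],
    fps: float = 30.0,
    ear_closed: float = 0.2,       # eye treated as closed below this EAR
    min_voluntary_s: float = 0.4,  # shorter closures treated as involuntary
    right_click_s: float = 1.0,    # closures at least this long -> right click
) -> List[ClickEvent]:
    """Classify eye closures by duration into mouse-analogous click events."""
    events: List[ClickEvent] = []
    closed_since: Optional[int] = None
    for i, ear in enumerate(ear_stream):
        if ear < ear_closed:
            if closed_since is None:
                closed_since = i  # closure starts
        elif closed_since is not None:
            duration_s = (i - closed_since) / fps
            if duration_s >= right_click_s:
                events.append(ClickEvent("right_click", i))
            elif duration_s >= min_voluntary_s:
                events.append(ClickEvent("left_click", i))
            # closures shorter than min_voluntary_s are ignored as involuntary
            closed_since = None
    return events


if __name__ == "__main__":
    # Synthetic EAR trace: open eyes (~0.3) with one deliberate 0.5 s blink.
    trace = [0.3] * 30 + [0.1] * 15 + [0.3] * 30
    print(blinks_to_clicks(trace))  # -> one left_click event
```

In practice the duration thresholds would be calibrated per user, since the boundary between spontaneous and intentional blinks varies across individuals.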
