Assistive Pointing Device Based on a Head-Mounted Camera

This paper introduces and validates an alternative input device for people with limited hand/arm movement and control. A summary of the current state of the art in alternative input devices is provided. Based on this, a low-cost solution is proposed that offers 1) low latency with high operating speed and accuracy, 2) use from different viewing angles without recalibration, and 3) the ability to seamlessly control multiple devices. A prototype of this system was built and tested to determine its accuracy (using a pan-tilt system) and to compare its performance against the state of the art (through user tests based on the ISO 9241-411 test). The proposed system achieves high accuracy ($\sigma_X = 0.28$ px, $\sigma_Y = 0.29$ px, $\sigma_{XY} = 0.02$ px), performance comparable to the state of the art (throughput = 1.52 bits/s, error rate = 31%), and good results on the ISO 9241-411 independent rating scale. The results of the experiments show that the proposed system holds great promise for real-world applications. A low-cost head-mounted camera could serve as an alternative human interface device for people with limited hand/arm movement and control, allowing users to participate in the ongoing trend of computing devices gaining importance in our everyday activities.
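
The throughput figure quoted above follows the effective (Fitts' law) formulation used in ISO 9241-411 style evaluations. The sketch below is a minimal illustration of that computation only, not the authors' analysis code; the function name, the example values, and the use of the nominal target distance in place of the effective distance are assumptions.

```python
import math
import statistics

def iso9241_throughput(target_distance, endpoint_errors, movement_times):
    """Effective throughput (bits/s) as commonly computed for ISO 9241-411 tests.

    target_distance  -- nominal centre-to-centre target distance D (px); using the
                        nominal rather than effective distance is an assumption here
    endpoint_errors  -- per-trial selection deviations from the target centre along
                        the task axis (px)
    movement_times   -- per-trial movement times (s)
    """
    # Effective target width: 4.133 times the standard deviation of the endpoints.
    w_e = 4.133 * statistics.stdev(endpoint_errors)
    # Effective index of difficulty (Shannon formulation), in bits.
    id_e = math.log2(target_distance / w_e + 1)
    # Throughput: effective index of difficulty over the mean movement time.
    return id_e / statistics.mean(movement_times)


# Hypothetical example: D = 400 px and a handful of trials (illustrative values only).
print(iso9241_throughput(400, [12.0, -8.5, 20.1, -15.3, 5.2],
                         [1.9, 2.1, 2.3, 1.8, 2.0]))
```

In practice such a value would be computed per amplitude/width condition and averaged across participants, which is how a single bits/s figure like the one reported here is typically obtained.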
