Exploratory Analysis of Nose-gesture for Smartphone Aided Typing for Users with Clinical Conditions

Clinical disabilities can make typing difficult with traditional smartphone text entry systems, which demand steady and precise finger movement within a restricted keyboard area. Alternative approaches such as voice typing perform poorly in noisy environments, while gaze typing requires large displays. In this paper, we develop Nosype, a hands-free approach to typing on smartphones that uses nose gestures. In a lab-scale study with 10 users, we observe that Nosype enables hands-free typing at a speed of 6.5 words per minute. Additionally, a usability study with 60 participants, including 10 with clinical disabilities, yields an average usability score of 77.708.
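
The abstract does not describe how nose gestures are captured or mapped to keys. The following is a minimal illustrative sketch, not the authors' implementation: it assumes an OpenCV Haar-cascade face detector on the front camera, approximates the nose position as the centre of the detected face box, and maps that position onto a hypothetical 3-row key grid. The cascade choice, key layout, and `nose_to_key` mapping are all assumptions made for illustration.

```python
# Hypothetical sketch of nose-position-driven key selection (not Nosype itself).
import cv2

# Stock Haar face detector shipped with OpenCV; the nose is approximated as the
# centre of the detected face box (a deliberate simplification for illustration).
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

# Assumed 3-row key grid; a real system would use the platform keyboard layout.
KEY_GRID = [list("qwertyuiop"), list("asdfghjkl"), list("zxcvbnm. ")]


def nose_to_key(x_norm, y_norm):
    """Map a normalised (0..1) nose position to a key in the assumed grid."""
    row = min(int(y_norm * len(KEY_GRID)), len(KEY_GRID) - 1)
    col = min(int(x_norm * len(KEY_GRID[row])), len(KEY_GRID[row]) - 1)
    return KEY_GRID[row][col]


def main():
    cap = cv2.VideoCapture(0)  # front-facing camera
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = face_cascade.detectMultiScale(gray, 1.3, 5)
        for (x, y, w, h) in faces[:1]:
            # Rough nose position = centre of the face bounding box.
            cx, cy = x + w // 2, y + h // 2
            key = nose_to_key(cx / frame.shape[1], cy / frame.shape[0])
            cv2.putText(frame, f"key: {key}", (10, 30),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, (0, 255, 0), 2)
        cv2.imshow("nose typing sketch", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cap.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    main()
```

A full system would additionally need a dwell or gesture-based commit mechanism to confirm a key, which is omitted here.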
