Human-Computer Interaction Task Classification via Visual-Based Input Modalities

Enhancing computers with the ability to perceive and recognise users' feelings and abilities, as well as aspects of the task being performed, is a key element in creating intelligent human-computer interaction. Many studies have focused on predicting users' cognitive and affective states and other human factors, such as usability and user experience, to achieve high-quality interaction. However, a complementary approach is needed that enables computers to perceive more about the task the user is carrying out. This paper presents a study of user-driven, task-based classification in which the classification algorithm uses features from visual-based input modalities: facial expression captured via webcam and eye gaze behaviour captured via eye-tracker. In the experiments presented herein, the dataset comprises four different computer-based tasks. Using a Support Vector Machine-based classifier, the average classification accuracy across 42 subjects is 85.52% when facial-based features are used as the input feature vector and 49.65% when eye gaze-based features are used. Combining both types of features achieves an average classification accuracy of 87.63%.
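To make the described setup concrete, the sketch below shows feature-level fusion of facial-expression and eye-gaze feature vectors followed by a multiclass SVM, in the spirit of the approach summarised above. It is a minimal illustration, not the authors' code: the feature dimensions, the placeholder data, the hyperparameters, and the train/test split are all assumptions. scikit-learn's SVC is used here because it wraps LIBSVM, the library cited by the paper.

```python
# Illustrative sketch (not the authors' implementation): task classification
# from fused facial-expression and eye-gaze features with an SVM classifier.
# Feature extraction, array shapes, and the evaluation split are assumptions.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC  # scikit-learn's SVC wraps LIBSVM

rng = np.random.default_rng(0)

n_samples = 200       # hypothetical number of recorded task segments
n_face_feats = 30     # e.g. facial landmark / expression statistics (assumed)
n_gaze_feats = 12     # e.g. fixation and saccade statistics (assumed)

# Placeholder feature matrices; in practice these would come from a webcam
# facial-expression pipeline and an eye-tracker, respectively.
face_features = rng.normal(size=(n_samples, n_face_feats))
gaze_features = rng.normal(size=(n_samples, n_gaze_feats))
task_labels = rng.integers(0, 4, size=n_samples)  # four computer-based tasks

# Feature-level fusion: concatenate the two modalities per sample.
fused = np.hstack([face_features, gaze_features])

X_train, X_test, y_train, y_test = train_test_split(
    fused, task_labels, test_size=0.3, random_state=0, stratify=task_labels)

# RBF-kernel SVM (the LIBSVM default); hyperparameters are illustrative only.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train, y_train)
print("Task classification accuracy:", clf.score(X_test, y_test))
```

The same pipeline can be run on each modality separately (passing only `face_features` or only `gaze_features`) to compare single-modality accuracy against the fused feature vector, mirroring the comparison reported in the abstract.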
