Human-Computer Interaction Task Classification via Visual-Based Input Modalities