Affective computing study of attention recognition for the 3D guide system

Eye tracking has been widely used across many disciplines in recent years. However, most studies have focused on the analysis of static images or text rather than on highly interactive applications. In addition, the rise and development of affective computing in recent years have fundamentally changed design thinking for human–computer interaction. This study therefore integrates affective computing, via eye-tracking technology, into a 3D guide system developed for a real campus. The system analyses the user's gaze position and recognises attention-related emotion according to regions of interest set in the environment, then displays feedback content corresponding to the region and the recognised emotion. By relying on gaze, the most intuitive form of input, the system reduces the user's operational burden and improves the interactive experience, making it intuitive and user-friendly. The results can also be applied to medical therapy for human attention training.
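The core mechanism described above, mapping gaze samples onto regions of interest and deciding which region holds the user's attention, can be sketched as follows. This is a minimal illustration only, not the paper's actual implementation; the class names, the fixed sampling interval, and the dwell-time threshold are all assumptions introduced for the example.

```python
from dataclasses import dataclass

@dataclass
class AOI:
    """A rectangular area of interest (hypothetical structure) placed in the 3D guide scene."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        # Hit-test a gaze sample against this region's bounding box.
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

def dwell_times(samples, aois, dt):
    """Accumulate gaze dwell time (seconds) per AOI.

    samples: list of (x, y) gaze positions captured at a fixed interval dt.
    """
    dwell = {a.name: 0.0 for a in aois}
    for gx, gy in samples:
        for a in aois:
            if a.contains(gx, gy):
                dwell[a.name] += dt
                break  # assume non-overlapping regions
    return dwell

def attended_aoi(dwell, threshold=0.5):
    """Return the AOI with the most dwell time if it exceeds the threshold, else None."""
    name, t = max(dwell.items(), key=lambda kv: kv[1])
    return name if t >= threshold else None
```

A guide system built this way would call `attended_aoi` on a sliding window of recent gaze samples and, when a region crosses the dwell threshold, trigger the feedback content associated with that region.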
