Analysis of Tourists' Nationality Effects on Behavior-based Emotion and Satisfaction Estimation

Smart tourism has attracted increasing attention from researchers in recent years. Its technologies allow tourists to obtain useful information during sightseeing through smart devices and similar tools. To provide suitable, personalized tourism information matched to a tourist's situation, it is important to understand their psychological state during sightseeing, especially their emotional state and satisfaction level. We assume that tourists' psychological states manifest in unconscious behaviors during sightseeing, such as head/body movements and facial/vocal expressions, and we have previously proposed methods that estimate emotion and satisfaction by sensing and analyzing these behaviors. Through in-the-wild experiments with 22 participants, we found that differences in tourists' attributes may affect estimation performance. In this paper, we statistically analyze these effects, focusing on tourists' nationality. A two-way ANOVA revealed an interaction effect (disordinal interaction) between tourists' nationality and estimation performance, a main effect of differences in features, and a main effect of differences in tourists' nationality. These results imply that tourists' nationality must be taken into account when building estimation models. Contribution: We statistically analyzed nationality effects on tourist emotion and satisfaction estimation, and confirmed significant differences in feature contributions to the estimation models.
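To make the analysis concrete, the two-way ANOVA described above can be sketched for a balanced design in plain NumPy. This is an illustrative sketch only, not the authors' actual analysis code: the factor names (`nationality`, `feature`), the cell layout, and the replicate count are hypothetical stand-ins for the paper's experimental data.

```python
import numpy as np

def two_way_anova(data):
    """Balanced two-way ANOVA on an array of shape (A, B, n):
    A levels of factor 1 (e.g. nationality), B levels of factor 2
    (e.g. feature set), n replicate scores per cell."""
    A, B, n = data.shape
    grand = data.mean()
    a_means = data.mean(axis=(1, 2))      # factor-A level means
    b_means = data.mean(axis=(0, 2))      # factor-B level means
    cell_means = data.mean(axis=2)        # per-cell means
    # Sums of squares for each effect (balanced-design formulas)
    ss_a = n * B * np.sum((a_means - grand) ** 2)
    ss_b = n * A * np.sum((b_means - grand) ** 2)
    inter = cell_means - a_means[:, None] - b_means[None, :] + grand
    ss_ab = n * np.sum(inter ** 2)
    ss_within = np.sum((data - cell_means[:, :, None]) ** 2)
    df = dict(a=A - 1, b=B - 1, ab=(A - 1) * (B - 1), within=A * B * (n - 1))
    ms_within = ss_within / df["within"]
    # F statistics: mean square of each effect over the within-cell mean square
    f = dict(a=(ss_a / df["a"]) / ms_within,
             b=(ss_b / df["b"]) / ms_within,
             ab=(ss_ab / df["ab"]) / ms_within)
    return dict(ss=dict(a=ss_a, b=ss_b, ab=ss_ab, within=ss_within),
                df=df, f=f)
```

A significant `f["ab"]` would correspond to the interaction effect reported above, and `f["a"]`/`f["b"]` to the two main effects; in practice a library routine such as `statsmodels`' `anova_lm` would also supply p-values and handle unbalanced designs.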
