Visualizing the Emotional Journey of a Museum

Wearable devices and new types of sensors make it possible to capture people's behavior, activity and, potentially, cognitive state in daily life. Today these devices are mainly used for well-being applications that record and display a person's activity. Some published work goes a step further, inferring the emotional state of individuals or groups of people from the recorded signals. However, the information provided and the way it is presented are still in their infancy, typically limited to timeline graphs of calories, heart rate, steps, temperature and sometimes affective intensity. In this paper we present an experiment carried out during visits by different people to a museum of fine arts, in order to capture the emotional impact of the exhibited paintings, and we propose an associated visualization of their emotional journey. Emotion is measured here as the affective response while observing the paintings, and the processing algorithm is based on an existing technique adapted to the particular case of varying observation durations. The visualization is based on a 3D map of the museum in which each painting is assigned a color, yielding an emotional heat-map of the museum (more precisely, of its arousal dimension). The validation was performed at the Museum of Fine Arts in Lyon, France, with 46 visitors and a total of 27 paintings exhibited in three different rooms.
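To make the two key steps concrete, the sketch below illustrates, under our own assumptions rather than the paper's exact algorithm, (i) an arousal index derived from an electrodermal activity (EDA) trace and normalized by observation duration, so that short and long viewings remain comparable, and (ii) a simple mapping of the normalized scores to colors for a per-painting heat-map. All names (arousal_score, the 4 Hz sampling rate, the synthetic traces) are hypothetical placeholders.

```python
import numpy as np

def arousal_score(eda_signal, fs=4.0, duration_s=None):
    """Crude per-second arousal index from an EDA trace (hypothetical sketch).
    Positive first differences approximate phasic skin-conductance responses;
    their sum is divided by the observation duration so that paintings viewed
    for different lengths of time can be compared."""
    eda_signal = np.asarray(eda_signal, dtype=float)
    if duration_s is None:
        duration_s = len(eda_signal) / fs
    rises = np.clip(np.diff(eda_signal), 0.0, None)  # keep only conductance increases
    return rises.sum() / max(duration_s, 1e-6)       # arousal per second of observation

# Toy EDA traces (4 Hz) for three paintings viewed for different durations.
rng = np.random.default_rng(0)
viewings = {
    "painting_A": rng.normal(0.0, 0.02, 4 * 30).cumsum() + 5.0,  # 30 s viewing
    "painting_B": rng.normal(0.0, 0.05, 4 * 90).cumsum() + 5.0,  # 90 s viewing
    "painting_C": rng.normal(0.0, 0.01, 4 * 45).cumsum() + 5.0,  # 45 s viewing
}
scores = {name: arousal_score(sig) for name, sig in viewings.items()}

# Map normalized arousal to a simple blue-to-red color per painting (heat-map legend).
values = np.array(list(scores.values()))
span = float(values.max() - values.min()) or 1.0
for name, score in scores.items():
    v = (score - values.min()) / span
    r, g, b = v, 0.0, 1.0 - v
    print(f"{name}: arousal/s = {score:.4f}, color RGB = ({r:.2f}, {g:.2f}, {b:.2f})")
```

In a full pipeline, the per-painting colors would then be applied to the corresponding locations on the 3D model of the museum rooms; the example stops at the color assignment.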
