Towards Analyzing and Predicting the Experience of Live Performances with Wearable Sensing

We present an approach to interpreting audience responses to live performances by processing mobile sensor data. We apply our method to three datasets obtained from three live performances, in which each audience member wore a single tri-axial accelerometer and a proximity sensor embedded in a smart sensor pack. Using these sensor data, we develop a novel approach to predicting audience members' self-reported experience of the performances in terms of enjoyment, immersion, willingness to recommend the event to others, and change in mood. The proposed method identifies informative intervals of the event in an unsupervised manner, based on the linkage of the audience members' bodily movements, and uses data from these intervals alone to estimate the audience members' experience. We also analyze how the relative locations of audience members can affect their experience and present an automatic way of recovering neighborhood information from the proximity sensors. We further show that the linkage of the audience members' bodily movements is indicative of memorable moments that the audience later reported.
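
To make the interval-selection step concrete, below is a minimal sketch assuming that "linkage" is measured as the mean pairwise Pearson correlation of acceleration magnitudes within fixed, non-overlapping windows, and that informative intervals are the windows whose linkage exceeds a quantile threshold. The window length, correlation measure, threshold, and downstream classifier mentioned in the comments are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np
from itertools import combinations

def movement_magnitude(acc_xyz):
    """Per-sample magnitude of a tri-axial accelerometer signal of shape (samples, 3)."""
    return np.linalg.norm(acc_xyz, axis=1)

def windowed_linkage(signals, window):
    """Mean pairwise Pearson correlation per non-overlapping window.

    signals: (members, samples) array of time-aligned movement magnitudes.
    The mean pairwise correlation is used here as a proxy for the 'linkage'
    of the audience members' bodily movements.
    """
    n_members, n_samples = signals.shape
    n_windows = n_samples // window
    linkage = np.zeros(n_windows)
    for w in range(n_windows):
        seg = signals[:, w * window:(w + 1) * window]
        corrs = [np.corrcoef(seg[i], seg[j])[0, 1]
                 for i, j in combinations(range(n_members), 2)]
        linkage[w] = np.nanmean(corrs)  # NaNs can arise from motionless segments
    return linkage

def select_informative_intervals(linkage, quantile=0.8):
    """Unsupervised selection: keep windows whose linkage exceeds a quantile threshold."""
    return np.where(linkage >= np.quantile(linkage, quantile))[0]

# Hypothetical downstream step: per-member features computed over the selected
# intervals only (e.g., mean and variance of movement magnitude) would feed a
# standard classifier such as an SVM to predict the self-reported experience
# labels (enjoyment, immersion, willingness to recommend, change in mood).
```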
