Exploring Audience Response in Performing Arts with a Brain-Adaptive Digital Performance System

Audience response is an important indicator of the quality of the performing arts. Psychophysiological measurements enable researchers to perceive and understand audience response by collecting bio-signals during a live performance. However, understanding how the audience responds and adapting the performance to those responses are key elements that remain difficult to implement. To address this issue, we designed a brain-computer interactive system called Brain-Adaptive Digital Performance (BADP) that measures and analyzes audience engagement level within an interactive three-dimensional virtual theater. The BADP system monitors audience engagement in real time using electroencephalography (EEG) and attempts to improve it by applying content-related performing cues when the engagement level decreases. In this article, we compute an EEG-based engagement level and define thresholds to detect disengagement and re-engagement moments. In the experiment, we simulated two types of theater performance using the BADP system to provide participants with a high-fidelity virtual environment. We also created content-related performing cues for each performance under three different conditions. The results of these evaluations show that our algorithm accurately detects engagement status and that the performing cues have a positive impact on regaining audience engagement across different performance types. Our findings open new perspectives for audience-based theater performance design.
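
The engagement computation described above can be illustrated with a minimal sketch. The band definitions, the beta/(alpha+theta) index, and the baseline-relative threshold below are assumptions drawn from common practice in passive BCI work, not the paper's exact formulation; the function names (`band_power`, `engagement_index`, `is_disengaged`) are hypothetical.

```python
# Hypothetical sketch: EEG engagement index with a disengagement threshold.
# Assumes the common beta / (alpha + theta) index; the BADP system's exact
# formulation, bands, and thresholds may differ.
import numpy as np
from scipy.signal import welch

def band_power(signal, fs, low, high):
    """Average power of `signal` in the [low, high] Hz band via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return np.trapz(psd[mask], freqs[mask])

def engagement_index(eeg_window, fs=256):
    """Compute beta / (alpha + theta) for one EEG window (1-D array of samples)."""
    theta = band_power(eeg_window, fs, 4, 8)
    alpha = band_power(eeg_window, fs, 8, 13)
    beta = band_power(eeg_window, fs, 13, 30)
    return beta / (alpha + theta)

def is_disengaged(index_history, threshold_sd=1.0):
    """Flag disengagement when the latest index falls more than `threshold_sd`
    standard deviations below the running baseline of earlier windows."""
    baseline = np.mean(index_history[:-1])
    spread = np.std(index_history[:-1])
    return index_history[-1] < baseline - threshold_sd * spread
```

In a real-time setting, `engagement_index` would be evaluated on a sliding window of EEG samples and `is_disengaged` would decide when to trigger a content-related performing cue.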
