Affect-Aware Intelligent Environment Using Musical Cues as an Emotion Learning Framework

This position paper posits the use of an individual's affective responses to musical cues as a means of designing an implicit communication channel between the user and their immediate computing infrastructure, in the form of an intelligent environment. Interaction design for a sensor-rich intelligent environment is challenging, largely because such pervasive systems are dynamic, with no fixed set of interaction devices or users. However, knowledge of a user's affective responses to known musical cues may provide a learning framework for inferring affective states such as stress or frustration. The intelligent environment, in turn, may use these inferences to assess the user's (dis)approval of the services it provides, refining those services to better suit the user's immediate needs.