Affective Computing Experiments in Virtual Reality with Wearable Sensors. Methodological considerations and preliminary results

Affective computing (AfC) is a novel paradigm originally proposed in 1997 by Rosalind Picard from the MIT Media Lab in her seminal book [12]. It builds on the results of biomedical engineering and psychology and aims at allowing computer systems to detect, use, and express emotions [4]. While at first sight it may look general from the computer science point of view, it is in fact a constructive and practical approach oriented mainly at improving human-like decision support as well as human-computer interaction. AfC is a field of study concerned with the design and description of systems that are able to collect, interpret, process, and ultimately simulate emotional states (affects). We assume that emotions are physical and cognitive [12, p. 21] and as such can be studied in an interdisciplinary way by computer science, biomedical engineering, and psychology. For affective computing, two elements are crucial: the modes of data collection and the ways of interpreting the collected data in correlation with affective states corresponding to emotions. The first is addressed by selecting methods for detecting information about emotions, that is, by using various sensors which capture data about human physical states and behaviors. The most often harvested and processed information today concerns: speech (prosody: pitch variables, speech rate), body gestures and poses (3D mapping, motion capture techniques), facial expressions (visual analysis and electromyography), and physiological monitoring (blood pressure, blood volume pulse, galvanic skin response). In our work we plan to use a range of wearable physiological sensors, namely the Empatica E4. It is an advanced sensor based on technologies previously developed in the Affective Computing division of the MIT Media Lab. Moreover, it was used
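To illustrate the second element, interpreting sensor data in correlation with affective states, the sketch below shows how simple arousal-related features (mean skin conductance level and a count of skin conductance responses) could be extracted from a galvanic skin response stream such as the one provided by a wearable like the Empatica E4. This is a minimal sketch, not the pipeline described in the paper: the 4 Hz sampling rate, the filter and peak-detection thresholds, and the function names (lowpass, gsr_features) are illustrative assumptions.

```python
# Minimal sketch (assumed parameters, not the authors' pipeline): extracting simple
# arousal-related features from a galvanic skin response (GSR / EDA) signal,
# e.g. as recorded by a wearable sensor such as the Empatica E4.
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

SAMPLE_RATE_HZ = 4.0  # assumed EDA sampling rate; adjust for other sensors


def lowpass(signal: np.ndarray, cutoff_hz: float = 1.0) -> np.ndarray:
    """Smooth the raw conductance trace with a low-pass Butterworth filter."""
    b, a = butter(2, cutoff_hz / (SAMPLE_RATE_HZ / 2.0), btype="low")
    return filtfilt(b, a, signal)


def gsr_features(raw_gsr_microsiemens: np.ndarray) -> dict:
    """Return simple features often correlated with arousal:
    mean tonic skin conductance level and the number of skin conductance responses."""
    smoothed = lowpass(raw_gsr_microsiemens)
    # Treat local maxima with at least 0.05 uS prominence as SCR peaks (illustrative threshold).
    peaks, _ = find_peaks(smoothed, prominence=0.05)
    duration_s = len(smoothed) / SAMPLE_RATE_HZ
    return {
        "mean_scl_uS": float(np.mean(smoothed)),
        "scr_count": int(len(peaks)),
        "scr_rate_per_min": len(peaks) / duration_s * 60.0,
    }


if __name__ == "__main__":
    # Synthetic one-minute recording: a slowly drifting tonic level plus two SCR-like bumps.
    t = np.arange(0, 60, 1.0 / SAMPLE_RATE_HZ)
    signal = (2.0 + 0.01 * t
              + 0.2 * np.exp(-((t - 20) ** 2) / 4)
              + 0.3 * np.exp(-((t - 45) ** 2) / 4))
    print(gsr_features(signal))
```

Features of this kind are typically computed over short windows and then correlated with self-reported or experimentally induced affective states; the window length and thresholds would have to be tuned to the actual sensor and protocol.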

[1] Sylvia D. Kreibig, et al., Autonomic nervous system activity in emotion: A review, 2010, Biological Psychology.

[2] Steve DiPaola, et al., Exploring Different Ways of Navigating Emotionally Responsive Artwork in Immersive Virtual Environments, 2015, EVA.

[3] R. Lazarus, Psychological stress and the coping process, 1970.

[4] Rosalind W. Picard, Recognizing Stress, Engagement, and Positive Emotion, 2015, IUI.

[5] Grzegorz J. Nalepa, et al., Uncertain context data management in dynamic mobile environments, 2017, Future Gener. Comput. Syst.

[6] J. Prinz, Gut Reactions: A Perceptual Theory of Emotion, 2004.

[7] Andrew Ortony, et al., The Cognitive Structure of Emotions, 1988.

[8] Daniel McDuff, et al., BioInsights: Extracting personal data from “Still” wearable motion sensors, 2015, 2015 IEEE 12th International Conference on Wearable and Implantable Body Sensor Networks (BSN).

[9] A. Muaremi, et al., Towards Measuring Stress with Smartphones and Wearable Devices During Workday and Sleep, 2013, BioNanoScience.

[10] Grzegorz J. Nalepa, et al., Uncertainty handling in rule-based mobile context-aware systems, 2017, Pervasive Mob. Comput.

[11] Thomas Fritz, et al., Stuck and Frustrated or in Flow and Happy: Sensing Developers' Emotions and Progress, 2015, 2015 IEEE/ACM 37th IEEE International Conference on Software Engineering.

[12] Fabio Babiloni, et al., How to Measure Cerebral Correlates of Emotions in Marketing Relevant Tasks, 2014, Cognitive Computation.

[13] Tzyy-Ping Jung, et al., The Wearable Multimodal Monitoring System: A Platform to Study Falls and Near-Falls in the Real-World, 2015, HCI.

[14] G. Dockray, et al., Gut reactions, 1981, Nature.

[15] P. Young, et al., Emotion and personality, 1963.

[16] Akane Sano, et al., Stress Recognition Using Wearable Sensors and Mobile Phones, 2013, 2013 Humaine Association Conference on Affective Computing and Intelligent Interaction.