User Affect Elicitation with a Socially Emotional Robot
Beno Benhabib | Goldie Nejat | Mingyang Shao | Matt Snyder
[1] T. Wheatley, et al. Music and movement share a dynamic structure that supports universal expressions of emotion, 2012, Proceedings of the National Academy of Sciences.
[2] Veronica Sundstedt, et al. The Effect of Emotions and Social Behavior on Performance in a Collaborative Serious Game Between Humans and Autonomous Robots, 2018, Int. J. Soc. Robotics.
[3] Pedro B. Albuquerque, et al. Emotional Induction Through Music: Measuring Cardiac and Electrodermal Responses of Emotional States and Their Persistence, 2019, Front. Psychol.
[4] Yi-Hsuan Yang, et al. 1000 songs for emotional analysis of music, 2013, CrowdMM '13.
[5] Bilge Mutlu, et al. Embodiment in Socially Interactive Robots, 2019, Found. Trends Robotics.
[6] T. Jung, et al. Fusion of electroencephalographic dynamics and musical contents for estimating emotional responses in music listening, 2014, Front. Neurosci.
[7] Andrzej Cichocki, et al. EmotionMeter: A Multimodal Framework for Recognizing Human Emotions, 2019, IEEE Transactions on Cybernetics.
[8] Rafael Ramírez, et al. Detecting Emotion from EEG Signals Using the Emotive Epoc Device, 2012, Brain Informatics.
[9] Ana Paiva, et al. Automatic analysis of affective postures and body motion to detect engagement with a game companion, 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[10] Zhenqi Li, et al. A Review of Emotion Recognition Using Physiological Signals, 2018, Sensors.
[11] Nicole Novielli, et al. Emotion detection using noninvasive low cost sensors, 2017, 2017 Seventh International Conference on Affective Computing and Intelligent Interaction (ACII).
[12] Beno Benhabib, et al. You Are Doing Great! Only One Rep Left: An Affect-Aware Social Robot for Exercising, 2019, 2019 IEEE International Conference on Systems, Man and Cybernetics (SMC).
[13] K. Scherer. What are emotions? And how can they be measured?, 2005.
[14] B. Geethanjali, et al. Evaluating the Induced Emotions on Physiological Response, 2018.
[15] Sidney K. D'Mello, et al. Affect Elicitation for Affective Computing, 2015.
[16] Mohammad Soleymani, et al. Cross-corpus EEG-based emotion recognition, 2018, 2018 IEEE 28th International Workshop on Machine Learning for Signal Processing (MLSP).
[17] M. Bradley, et al. Measuring emotion: the Self-Assessment Manikin and the Semantic Differential, 1994, Journal of Behavior Therapy and Experimental Psychiatry.
[18] Subramanian Ramanathan, et al. DECAF: MEG-Based Multimodal Database for Decoding Affective Physiological Responses, 2015, IEEE Transactions on Affective Computing.
[19] Abeer Al-Nafjan, et al. Classification of Human Emotions from Electroencephalogram (EEG) Signal using Deep Neural Network, 2017.
[20] S. Koelsch. Towards a neural basis of music-evoked emotions, 2010, Trends in Cognitive Sciences.
[21] Ikuo Mizuuchi, et al. A situation-aware action selection based on individual's preference using emotion estimation, 2014, 2014 IEEE International Conference on Robotics and Biomimetics (ROBIO 2014).
[22] Christian Mühl, et al. Valence, arousal and dominance in the EEG during game play, 2013, Int. J. Auton. Adapt. Commun. Syst.
[23] Erik Cambria, et al. Affective Computing and Sentiment Analysis, 2016, IEEE Intelligent Systems.
[24] Ana Paiva, et al. Affect recognition for interactive companions: challenges and design in real world scenarios, 2009, Journal on Multimodal User Interfaces.
[25] Goldie Nejat, et al. Affect detection from body language during social HRI, 2012, 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication.
[26] B. Calvo-Merino, et al. Enhancing emotional experiences to dance through music: the role of valence and arousal in the cross-modal bias, 2014, Front. Hum. Neurosci.
[27] Goldie Nejat, et al. A Social Robot Learning to Facilitate an Assistive Group-Based Activity from Non-expert Caregivers, 2020, Int. J. Soc. Robotics.
[28] Goldie Nejat, et al. Classifying a Person’s Degree of Accessibility From Natural Body Language During Social Human–Robot Interactions, 2017, IEEE Transactions on Cybernetics.
[29] Shihong Lao, et al. Vision-Based Face Understanding Technologies and Their Applications, 2004, SINOBIOMETRICS.
[30] Antoni Gomila, et al. A Norming Study and Library of 203 Dance Movements, 2014, Perception.
[31] Goldie Nejat, et al. Tangy the Robot Bingo Facilitator: A Performance Review, 2015.
[32] Tanja Schultz, et al. Towards an EEG-based emotion recognizer for humanoid robots, 2009, RO-MAN 2009 - The 18th IEEE International Symposium on Robot and Human Interactive Communication.
[33] Hichem Sahli, et al. Natural emotion elicitation for emotion modeling in child-robot interactions, 2014, WOCCI.
[34] Beno Benhabib, et al. A Survey of Autonomous Human Affect Detection Methods for Social Robots Engaged in Natural HRI, 2016, J. Intell. Robotic Syst.
[35] Matthias Scheutz, et al. Reflections on the Design Challenges Prompted by Affect-Aware Socially Assistive Robots, 2017, Emotions and Personality in Personalized Services.
[36] Areej Al-Wabil, et al. Review and Classification of Emotion Recognition Based on EEG Brain-Computer Interface System Research: A Systematic Review, 2017.
[37] Goldie Nejat, et al. Meal-time with a socially assistive robot and older adults at a long-term care facility, 2013, HRI 2013.
[38] Ikuo Mizuuchi, et al. Elicitation of Specific Facial Expression by Robot's Action, 2015.
[39] Y. Trope, et al. Body Cues, Not Facial Expressions, Discriminate Between Intense Positive and Negative Emotions, 2012, Science.
[40] Goldie Nejat, et al. A focus group study on the design considerations and impressions of a socially assistive robot for long-term care, 2014, The 23rd IEEE International Symposium on Robot and Human Interactive Communication.
[41] Yuta Katsumi, et al. The role of arousal in the spontaneous regulation of emotions in healthy aging: a fMRI investigation, 2014, Front. Psychol.
[42] Dana Kulic, et al. Affective State Estimation for Human–Robot Interaction, 2007, IEEE Transactions on Robotics.
[43] Yuan-Pin Lin, et al. EEG-Based Emotion Recognition in Music Listening, 2010, IEEE Transactions on Biomedical Engineering.
[44] K. Scherer, et al. Bodily expression of emotion, 2009.
[45] Beno Benhabib, et al. A Socially Assistive Robot to Help With Getting Dressed, 2017.
[46] Beno Benhabib, et al. Personalized clothing recommendation by a social robot, 2017, 2017 IEEE International Symposium on Robotics and Intelligent Sensors (IRIS).
[47] Gaël Varoquaux, et al. Scikit-learn: Machine Learning in Python, 2011, J. Mach. Learn. Res.
[48] Thierry Pun, et al. DEAP: A Database for Emotion Analysis Using Physiological Signals, 2012, IEEE Transactions on Affective Computing.
[49] Daniel J. Levitin, et al. Cross-modal interactions in the experience of musical performances: Physiological correlates, 2008, Cognition.
[50] Naeem Ramzan, et al. DREAMER: A Database for Emotion Recognition Through EEG and ECG Signals From Wireless Low-cost Off-the-Shelf Devices, 2018, IEEE Journal of Biomedical and Health Informatics.
[51] Olga Sourina, et al. Real-time EEG-based emotion monitoring using stable features, 2015, The Visual Computer.
[52] S. Langenecker, et al. Emotion regulation through execution, observation, and imagery of emotional movements, 2013, Brain and Cognition.
[53] Goldie Nejat, et al. How Robots Influence Humans: A Survey of Nonverbal Communication in Social Human–Robot Interaction, 2019, International Journal of Social Robotics.
[54] Bin Hu, et al. Exploring EEG Features in Cross-Subject Emotion Recognition, 2018, Front. Neurosci.
[55] Koen V. Hindriks, et al. Effects of bodily mood expression of a robotic teacher on students, 2014, 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems.
[56] Jennifer Healey. Physiological Sensing of Emotion, 2015.
[57] Britta Wrede, et al. Social facilitation with social robots?, 2012, 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[58] Goldie Nejat, et al. Promoting Interactions Between Humans and Robots Using Robotic Emotional Behavior, 2016, IEEE Transactions on Cybernetics.
[59] Anna Esposito, et al. Introduction to the Special Issue “Beyond Industrial Robotics: Social Robots Entering Public and Domestic Spheres”, 2015, Inf. Soc.
[60] Andrew P. Bradley, et al. The use of the area under the ROC curve in the evaluation of machine learning algorithms, 1997, Pattern Recognit.
[61] Stephen M. Thielke, et al. Decline in health for older adults: five-year change in 13 key measures of standardized health, 2013, The Journals of Gerontology. Series A, Biological Sciences and Medical Sciences.
[62] L. Aftanas, et al. Affective picture processing: event-related synchronization within individually defined human theta band is modulated by valence dimension, 2001, Neuroscience Letters.
[63] Corinne Jola, et al. The experience of watching dance: phenomenological–neuroscience duets, 2012.
[64] Yan Ge, et al. Frontal EEG Asymmetry and Middle Line Power Difference in Discrete Emotions, 2018, Front. Behav. Neurosci.
[65] A. Nijholt, et al. A survey of affective brain computer interfaces: principles, state-of-the-art, and challenges, 2014.
[66] Beno Benhabib, et al. A Multimodal Emotional Human–Robot Interaction Architecture for Social Robots Engaged in Bidirectional Communication, 2020, IEEE Transactions on Cybernetics.
[67] K. R. Seeja, et al. Subject independent emotion recognition from EEG using VMD and deep learning, 2019, J. King Saud Univ. Comput. Inf. Sci.
[68] Goldie Nejat, et al. Determining the affective body language of older adults during socially assistive HRI, 2014, 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems.
[69] Goldie Nejat, et al. A Socially Assistive Robot That Can Monitor Affect of the Elderly During Mealtime Assistance, 2014.
[70] Carlos Busso, et al. The USC CreativeIT database of multimodal dyadic interactions: from speech and full body motion capture to continuous emotional annotations, 2015, Language Resources and Evaluation.