Multimodal Affect Modeling and Recognition for Empathic Robot Companions
Ana Paiva | Carlos Martinho | Peter W. McOwan | Ginevra Castellano | Iolanda Leite | André Pereira
[1] J. Russell. A circumplex model of affect, 1980.
[2] N. Borgers, et al. Children as Respondents in Survey Research: Cognitive Development and Response Quality, 2000.
[3] Candace L. Sidner, et al. Where to look: a study of human-robot engagement, 2004, IUI '04.
[4] Maja Pantic, et al. Web-based database for facial expression analysis, 2005, 2005 IEEE International Conference on Multimedia and Expo.
[5] Ashish Kapoor, et al. Multimodal affect recognition in learning environments, 2005, ACM Multimedia.
[6] Xue Yan, et al. iCat: an animated user-interface robot with personality, 2005, AAMAS '05.
[7] Ana Paiva, et al. Using Anticipation to Create Believable Behaviour, 2006, AAAI.
[8] Chih-Jen Lin, et al. Combining SVMs with Various Feature Selection Strategies, 2006, Feature Extraction.
[9] Kerstin Dautenhahn, et al. Socially intelligent robots: dimensions of human–robot interaction, 2007, Philosophical Transactions of the Royal Society B: Biological Sciences.
[10] Maja J. Mataric, et al. Investigating Implicit Cues for User State Estimation in Human-Robot Interaction Using Physiological Measurements, 2007, RO-MAN 2007 - The 16th IEEE International Symposium on Robot and Human Interactive Communication.
[11] Kostas Karpouzis, et al. The HUMAINE Database: Addressing the Collection and Annotation of Naturalistic and Induced Emotional Data, 2007, ACII.
[12] I. Poggi. Mind, hands, face and body. A goal and belief view of multimodal communication, 2007.
[13] Dana Kulic, et al. Affective State Estimation for Human–Robot Interaction, 2007, IEEE Transactions on Robotics.
[14] Fumihide Tanaka, et al. Socialization between toddlers and robots at an early childhood education center, 2007, Proceedings of the National Academy of Sciences.
[15] Ana Paiva, et al. Are emotional robots more fun to play with?, 2008, RO-MAN 2008 - The 17th IEEE International Symposium on Robot and Human Interactive Communication.
[16] Louis-Philippe Morency, et al. Context-based recognition during human interactions: automatic feature selection and encoding dictionary, 2008, ICMI '08.
[17] Adriana Tapus, et al. Socially Assistive Robots: The Link between Personality, Empathy, Physiological Signals, and Task Performance, 2008, AAAI Spring Symposium: Emotion, Personality, and Social Behavior.
[18] Changchun Liu, et al. Online Affect Detection and Robot Behavior Adaptation for Intervention of Children With Autism, 2008, IEEE Transactions on Robotics.
[19] P. McOwan, et al. Affect Recognition for Interactive Companions, 2008.
[20] Ana Paiva, et al. Detecting user engagement with a robot companion using task and social interaction-based features, 2009, ICMI-MLMI '09.
[21] C. Breazeal. Role of expressive behaviour for robots that learn from people, 2009, Philosophical Transactions of the Royal Society B: Biological Sciences.
[22] Ana Paiva, et al. It's all in the game: Towards an affect sensitive and context aware game companion, 2009, 2009 3rd International Conference on Affective Computing and Intelligent Interaction and Workshops.
[23] Ana Paiva, et al. As Time goes by: Long-term evaluation of social presence in robotic companions, 2009, RO-MAN 2009 - The 18th IEEE International Symposium on Robot and Human Interactive Communication.
[24] Ana Paiva, et al. Affect recognition for interactive companions: challenges and design in real world scenarios, 2009, Journal on Multimodal User Interfaces.
[25] Peter Robinson, et al. When my robot smiles at me: Enabling human-robot rapport via real-time head gesture mimicry, 2009.
[26] Zhihong Zeng, et al. A Survey of Affect Recognition Methods: Audio, Visual, and Spontaneous Expressions, 2009, IEEE Trans. Pattern Anal. Mach. Intell.
[27] Anton Nijholt, et al. Use of context in vision processing: an introduction to the UCVP 2009 workshop, 2009, UCVP '09.
[28] Kostas Karpouzis, et al. Investigating shared attention with a virtual agent using a gaze-based interface, 2010, Journal on Multimodal User Interfaces.
[29] Scherer, et al. On the use of actor portrayals in research on emotional expression, 2010.
[30] Ana Paiva, et al. Inter-ACT: an affective and contextually rich multimodal video corpus for studying interaction with robots, 2010, ACM Multimedia.
[31] Christopher E. Peters, et al. Socially perceptive robots: Challenges and concerns, 2010.
[32] Ana Paiva, et al. "Why Can't We Be Friends?" An Empathic Game Companion for Long-Term Interaction, 2010, IVA.
[33] Ramesh C. Jain, et al. Content without context is meaningless, 2010, ACM Multimedia.
[34] James C. Lester, et al. Modeling Learner Affect with Theoretically Grounded Dynamic Bayesian Networks, 2011, ACII.
[35] Bogdan Raducanu, et al. Long-term socially perceptive and interactive robot companions: challenges and future perspectives, 2011, ICMI '11.
[36] Piero Cosi, et al. Long-term human-robot interaction with young users, 2011, HRI 2011.
[37] Georgios N. Yannakakis, et al. Mining multimodal sequential patterns: a case study on affect detection, 2011, ICMI '11.
[38] Ana Paiva, et al. Automatic analysis of affective postures and body motion to detect engagement with a game companion, 2011, 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[39] Ana Paiva, et al. Long-Term Interactions with Empathic Robots: Evaluating Perceived Support in Children, 2012, ICSR.
[40] Ana Paiva, et al. Modelling empathic behaviour in a robotic game companion for children: An ethnographic study in real-world settings, 2012, 2012 7th ACM/IEEE International Conference on Human-Robot Interaction (HRI).
[41] Ana Paiva, et al. Detecting Engagement in HRI: An Exploration of Social and Task-Based Context, 2012, 2012 International Conference on Privacy, Security, Risk and Trust and 2012 International Conference on Social Computing.
[42] Maja Pantic, et al. IEEE Transactions on Affective Computing (article title unavailable in source record), 2022.