User's Emotional Profile Modelling in Spoken Human-Machine Interactions (Modélisation du profil émotionnel de l'utilisateur dans les interactions parlées Humain-Machine)

This thesis studies and formalises emotional Human-Machine interactions. Going beyond the one-off detection of paralinguistic information (emotions, disfluencies, etc.), the aim is to provide the system with a dynamic interactional and emotional profile of the user, enriched over the course of the interaction. This profile allows the machine to adapt its response strategies to the speaker, and it can also help manage long-term relationships. The profile is based on a multi-level representation of the emotional and interactional cues extracted from the audio using the LIMSI emotion-detection tools. Low-level cues (variations in F0, energy, etc.) provide information on the type of emotion expressed, its strength, the speaker's degree of talkativeness, and so on. These mid-level elements are then exploited by the system to determine, as the interactions unfold, the user's emotional and interactional profile. The profile comprises six dimensions: optimism, extraversion, emotional stability, self-confidence, affinity, and dominance (based on the OCEAN personality model and interpersonal circumplex theories). The system's social behaviour is adapted according to this profile, the state of the current task, and the robot's current behaviour. The rules for creating and updating the emotional and interactional profile, and for automatically selecting the robot's behaviour, were implemented in fuzzy logic using the decision engine developed by a partner of the ROMEO project. The system was deployed on the NAO robot.
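The pipeline described above (mid-level cues feeding fuzzy rules that build a profile dimension turn by turn) can be sketched as follows. This is a minimal illustrative sketch, not the actual ROMEO decision engine: the membership functions, the rule base for the hypothetical `extraversion` dimension, and the smoothing factor `alpha` are all assumptions made for the example.

```python
def tri(x, a, b, c):
    """Triangular membership function on [a, c], peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_extraversion(talkativeness, emotion_strength):
    """Infer an extraversion score in [0, 1] from two mid-level cues in [0, 1]."""
    # Fuzzify the inputs into LOW / HIGH memberships.
    talk_low = tri(talkativeness, -0.5, 0.0, 0.6)
    talk_high = tri(talkativeness, 0.4, 1.0, 1.5)
    str_low = tri(emotion_strength, -0.5, 0.0, 0.6)
    str_high = tri(emotion_strength, 0.4, 1.0, 1.5)

    # Rule base (min = fuzzy AND), each rule concluding an output level.
    rules = [
        (min(talk_high, str_high), 0.9),  # talkative + strong emotions -> extraverted
        (min(talk_high, str_low), 0.6),
        (min(talk_low, str_high), 0.4),
        (min(talk_low, str_low), 0.1),    # quiet + weak emotions -> introverted
    ]
    # Weighted-average defuzzification.
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules) or 1.0
    return num / den

def update_profile(current, observation, alpha=0.2):
    """Enrich the profile turn by turn via exponential smoothing."""
    return (1 - alpha) * current + alpha * observation

# Usage: start from a neutral prior and update over three interaction turns.
profile = 0.5
for talk, strength in [(0.8, 0.7), (0.9, 0.6), (0.7, 0.8)]:
    profile = update_profile(profile, fuzzy_extraversion(talk, strength))
# The profile drifts towards "extraverted" as talkative, emotional turns accumulate.
```

The smoothing step is what makes the profile dynamic rather than a one-off classification: each turn's fuzzy inference only nudges the long-term estimate, so isolated detection errors are damped.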
To study the different elements of the emotional interaction loop between the user and the system, we took part in the design of several systems: a pre-scripted Wizard-of-Oz system, a semi-automated system, and an autonomous emotional interaction system. These systems made it possible to collect data while controlling several emotion-elicitation parameters within an interaction; we present the results of these experiments, along with protocols for evaluating Human-Robot Interaction through the use of systems with different degrees of autonomy.
