Interpretation of user’s feedback in human-robot interaction

In this paper we propose the use of social robots as an interface between users and services in a Smart Environment. We focus on the robot's need to recognise the user's feedback, so that it can respond and revise its behaviour according to the user's needs. Since we regard speech as a natural and immediate input channel in human-robot interaction, we discuss the importance of recognising, besides the linguistic content of the spoken sentence, the user's attitude towards the robot and the environment. In this way, the meaning of the user's utterance can be clarified even when it is hard to recover from an analysis of the utterance structure alone. We then present the results of applying an approach that integrates linguistic analysis with the recognition of the valence and arousal of the user's utterance. To this end, we collected and analysed a corpus of data, built an interpretation model based on a Bayesian network, and measured the model's accuracy on a held-out test dataset. The results show that integrating the linguistic content of spoken sentences with the recognition of some of their acoustic features performs better than linguistic analysis alone at recognising the key aspects of the user's feedback.
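The paper does not include an implementation, but the fusion step it describes can be illustrated with a small sketch. The following is a minimal, hypothetical example, not the authors' actual model: a naive-Bayes-style Bayesian network, built with the pgmpy library, that combines a binary linguistic cue with binarised acoustic valence and arousal to infer the feedback class. All node names, states, and probability values are illustrative assumptions.

```python
# Minimal sketch (assumed structure and numbers, not the paper's model):
# a Bayesian network fusing linguistic and acoustic evidence with pgmpy.
from pgmpy.models import BayesianNetwork
from pgmpy.factors.discrete import TabularCPD
from pgmpy.inference import VariableElimination

# Naive-Bayes-style structure: the hidden feedback class (0 = positive,
# 1 = negative) generates the observable linguistic and acoustic cues.
model = BayesianNetwork([
    ("Feedback", "LinguisticCue"),
    ("Feedback", "Valence"),
    ("Feedback", "Arousal"),
])

# Prior over the feedback class (illustrative, uniform).
cpd_feedback = TabularCPD("Feedback", 2, [[0.5], [0.5]])

# P(linguistic cue | feedback): e.g. approval vs disapproval wording.
cpd_ling = TabularCPD(
    "LinguisticCue", 2,
    [[0.8, 0.3],   # approval wording
     [0.2, 0.7]],  # disapproval wording
    evidence=["Feedback"], evidence_card=[2],
)

# P(acoustic valence | feedback), binarised as high/low.
cpd_val = TabularCPD(
    "Valence", 2,
    [[0.7, 0.2],   # high valence
     [0.3, 0.8]],  # low valence
    evidence=["Feedback"], evidence_card=[2],
)

# P(acoustic arousal | feedback), binarised as high/low.
cpd_aro = TabularCPD(
    "Arousal", 2,
    [[0.6, 0.4],   # high arousal
     [0.4, 0.6]],  # low arousal
    evidence=["Feedback"], evidence_card=[2],
)

model.add_cpds(cpd_feedback, cpd_ling, cpd_val, cpd_aro)
assert model.check_model()

# Infer the feedback class from combined linguistic + acoustic evidence:
# approval wording, high valence, low arousal.
infer = VariableElimination(model)
posterior = infer.query(
    ["Feedback"],
    evidence={"LinguisticCue": 0, "Valence": 0, "Arousal": 1},
)
print(posterior)
```

One reason a Bayesian network suits this task is that inference conditions only on the evidence actually supplied: the same model still yields a (less confident) posterior when one channel, say the acoustic one, is unavailable or unreliable for a given utterance.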
