An Architecture with a Mobile Phone Interface for the Interaction of a Human with a Humanoid Robot Expressing Emotions and Personality

In this paper we illustrate the cognitive architecture of a humanoid robot based on the Latent Semantic Analysis (LSA) paradigm. This paradigm is a step towards simulating the emotional behavior of a robot interacting with humans. The LSA approach allows the creation and use of a data-driven, high-dimensional conceptual space. We developed an architecture organized into three main areas: Sub-conceptual, Emotional, and Behavioral. The first area analyzes perceptual data coming from the sensors. The second area builds a sub-symbolic representation of emotions in a conceptual space of emotional states. The last area triggers a latent semantic behavior related to the humanoid's current emotional state. The robot's overall behavior also takes its "personality" into account. We implemented the system on an Aldebaran NAO humanoid robot and tested the emotional interaction with humans using a mobile phone as the interface.
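To make the LSA step of the architecture concrete, the sketch below shows how a data-driven conceptual space can be built with a truncated singular value decomposition, how a new utterance can be folded into that space, and how the nearest emotional state can be selected by cosine similarity. This is a minimal illustration using only NumPy; the function names, the folding-in step, and the emotion-prototype lookup are our assumptions for exposition, not the authors' actual implementation.

```python
import numpy as np

def build_lsa_space(term_doc_matrix, k=100):
    """Build a k-dimensional latent semantic (conceptual) space
    from a term-document co-occurrence matrix via truncated SVD."""
    U, s, Vt = np.linalg.svd(term_doc_matrix, full_matrices=False)
    U_k = U[:, :k]          # term vectors in the latent space
    S_k = np.diag(s[:k])    # singular values scaling each latent axis
    return U_k, S_k

def fold_in(query_counts, U_k, S_k):
    """Project a new bag-of-words vector (e.g. a user utterance)
    into the conceptual space (the standard LSA 'folding-in')."""
    return query_counts @ U_k @ np.linalg.inv(S_k)

def closest_emotion(query_vec, emotion_prototypes):
    """Return the emotional state whose prototype vector is most
    similar (cosine similarity) to the projected utterance."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))
    return max(emotion_prototypes.items(),
               key=lambda kv: cos(query_vec, kv[1]))[0]
```

In this reading, the Emotional area would map sensor-derived text (e.g. speech recognized from the phone interface) into the conceptual space and associate it with the nearest emotional state, which the Behavioral area then uses to trigger the corresponding latent semantic behavior.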
