Gestural and facial communication with smart phone based robot partner using emotional model

To conduct natural communication, a robot partner should understand not only verbal communication but also non-verbal cues such as facial and gestural information. Here, “understand” means that the robot grasps the meaning of the gesture itself. In this paper we propose a smartphone-based system in which an emotional model connects the facial and gestural communication of a human and a robot partner. On the input side, the emotional model is driven by face classification and gesture recognition of the human; on the output side, the model determines the robot's actions, such as gestural and facial expressions.
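
The abstract describes this pipeline only at a high level. The following is a minimal sketch of how such an emotional model could connect perception to action: recognized facial and gestural cues drive a decaying internal emotional state, which is then mapped to an output expression. The two-dimensional valence-arousal state, the cue-to-emotion mapping, and all names (`EmotionalModel`, `PerceptionInput`, `CUE_EFFECTS`) are illustrative assumptions, not the paper's actual formulation.

```python
from dataclasses import dataclass

@dataclass
class PerceptionInput:
    """Hypothetical perception outputs from the human side."""
    face_label: str      # e.g. "smile" from face classification
    gesture_label: str   # e.g. "wave" from gesture recognition

# Assumed mapping from recognized human cues to a change in the robot's
# (valence, arousal) state. The numeric values are illustrative only.
CUE_EFFECTS = {
    "smile": (0.3, 0.1),
    "frown": (-0.3, 0.2),
    "wave": (0.2, 0.3),
    "point": (0.0, 0.2),
}

class EmotionalModel:
    """Toy emotional model: a decaying 2-D state driven by perceived cues."""

    def __init__(self, decay: float = 0.9):
        self.valence = 0.0
        self.arousal = 0.0
        self.decay = decay

    def update(self, percept: PerceptionInput) -> None:
        dv, da = 0.0, 0.0
        for label in (percept.face_label, percept.gesture_label):
            ev, ea = CUE_EFFECTS.get(label, (0.0, 0.0))
            dv += ev
            da += ea
        # Decay toward a neutral state, then apply the new stimulus,
        # clamping each dimension to [-1, 1].
        self.valence = max(-1.0, min(1.0, self.decay * self.valence + dv))
        self.arousal = max(-1.0, min(1.0, self.decay * self.arousal + da))

    def select_action(self) -> tuple[str, str]:
        """Map the emotional state to a robot facial expression and gesture."""
        if self.valence > 0.2:
            face = "happy_face"
        elif self.valence < -0.2:
            face = "sad_face"
        else:
            face = "neutral_face"
        gesture = "energetic_gesture" if self.arousal > 0.3 else "calm_gesture"
        return face, gesture

if __name__ == "__main__":
    model = EmotionalModel()
    model.update(PerceptionInput(face_label="smile", gesture_label="wave"))
    print(model.select_action())  # e.g. ('happy_face', 'energetic_gesture')
```

In this sketch the emotional state acts as the single shared interface between recognition and expression, which matches the abstract's claim that the emotional model is what connects the two sides of the interaction.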
