A multimodal affective computing approach for children's companion robots

This paper describes a multimodal affective-computing fusion approach for children's companion robots. The proposed fusion model processes both verbal and nonverbal information from users. This methodology can improve the emotion recognition and classification abilities of social robots, particularly children's companion robots, and enhance the experience of immediate visual interaction between children and robots.
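The paper does not specify how verbal and nonverbal signals are combined; a minimal sketch of one common realization, decision-level (late) fusion of per-modality emotion probabilities, is shown below. The emotion labels, probability values, and the weight `w_verbal` are all illustrative assumptions, not details taken from the paper.

```python
# Hypothetical late-fusion sketch: each modality (verbal, nonverbal)
# independently outputs a probability over emotion classes, and the
# fused score is a normalized weighted average. All values illustrative.

EMOTIONS = ["happy", "sad", "angry", "neutral"]

def fuse(verbal_probs, nonverbal_probs, w_verbal=0.6):
    """Weighted late fusion of two per-emotion probability vectors."""
    w_nonverbal = 1.0 - w_verbal
    fused = [w_verbal * v + w_nonverbal * n
             for v, n in zip(verbal_probs, nonverbal_probs)]
    total = sum(fused)
    return [p / total for p in fused]  # renormalize to sum to 1

def classify(fused_probs):
    """Return the emotion label with the highest fused probability."""
    best = max(range(len(fused_probs)), key=lambda i: fused_probs[i])
    return EMOTIONS[best]

# Example: speech suggests "happy", facial cues mildly agree.
verbal = [0.7, 0.1, 0.1, 0.1]
nonverbal = [0.5, 0.2, 0.1, 0.2]
print(classify(fuse(verbal, nonverbal)))  # → happy
```

Decision-level fusion is only one option; feature-level (early) fusion, which concatenates per-modality features before a single classifier, is an equally common alternative in the multimodal affective-computing literature.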
