A socially-intelligent multi-robot service team for in-home monitoring

The objective of this study is to develop a socially intelligent service team composed of multiple robots with sophisticated sonic-interaction capabilities that collaborate transparently towards efficient and robust monitoring through close interaction. In the distributed scenario proposed in this study, the robots share the acoustic data extracted from the environment and act in sync with the events occurring in their living environment, providing a means for efficient monitoring and decision-making within a typical home enclosure. Although each robot acts as an individual recognizer using a novel emotionally enriched word recognition system, the final decision is social in nature and is followed by all robots. Moreover, the social decision stage triggers actions that are algorithmically distributed among the robot population, enhancing the overall approach with the potential advantages of teamwork within specific communities through collaboration.
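The pipeline described above (per-robot recognition, a shared social decision, and algorithmic distribution of the triggered actions) can be sketched minimally as follows. This is an illustrative sketch only: the abstract does not specify the fusion rule or the distribution algorithm, so majority voting and round-robin assignment are assumptions made here for concreteness, and all function and robot names are hypothetical.

```python
from collections import Counter

def social_decision(individual_labels):
    """Fuse per-robot recognition results into one team decision.

    individual_labels: list of (robot_id, label) pairs, one per robot,
    where each label is the output of that robot's own recognizer.
    A simple majority vote stands in for the paper's social decision
    stage; ties break in favor of the earliest-reported label.
    """
    counts = Counter(label for _, label in individual_labels)
    return counts.most_common(1)[0][0]

def distribute_actions(actions, robot_ids):
    """Assign the actions triggered by the social decision to robots
    in round-robin order, a stand-in for the (unspecified)
    algorithmic distribution among the robot population."""
    return {action: robot_ids[i % len(robot_ids)]
            for i, action in enumerate(actions)}

# Example: three robots independently recognize the same acoustic event.
votes = [("r1", "help"), ("r2", "help"), ("r3", "music")]
decision = social_decision(votes)  # the team-wide decision: "help"

# The agreed decision triggers actions, spread across the team.
tasks = distribute_actions(["approach_user", "notify_caregiver"],
                           ["r1", "r2", "r3"])
```

Because every robot applies the same fusion rule to the same shared data, each reaches the same decision without a central coordinator, which matches the distributed character of the proposed scenario.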
