The Emotionally Intelligent Robot: Improving Social Navigation in Crowded Environments

We present a real-time algorithm for emotion-aware navigation of a robot among pedestrians. Our approach estimates time-varying emotional behaviors of pedestrians from their faces and trajectories using a combination of Bayesian inference, CNN-based learning, and the PAD (Pleasure-Arousal-Dominance) model from psychology. These PAD characteristics are used for long-term path prediction and for generating proxemic constraints for each pedestrian. We use a multi-channel model to classify pedestrian characteristics into four emotion categories (happy, sad, angry, neutral). In our validation results, we observe an emotion detection accuracy of 85.33%. We formulate emotion-based proxemic constraints to perform socially aware robot navigation in low- to medium-density environments. We demonstrate the benefits of our algorithm in simulated environments with tens of pedestrians as well as in a real-world setting with Pepper, a social humanoid robot.
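
To make the pipeline concrete, the sketch below shows one plausible way to map an estimated PAD (pleasure, arousal, dominance) vector to one of the four emotion categories and then to a per-pedestrian proxemic clearance. The centroid coordinates, comfort radii, and function names are illustrative assumptions for exposition only; they are not the paper's trained parameters or actual implementation.

```python
import numpy as np

# Assumed PAD centroids for the four emotion categories named in the abstract.
# The exact coordinates are illustrative placeholders, not learned values.
EMOTION_CENTROIDS = {
    "happy":   np.array([0.8, 0.5, 0.4]),
    "sad":     np.array([-0.6, -0.4, -0.3]),
    "angry":   np.array([-0.5, 0.7, 0.3]),
    "neutral": np.array([0.0, 0.0, 0.0]),
}

# Assumed per-emotion comfort radii in meters; the abstract derives proxemic
# constraints from emotion, but these specific distances are placeholders.
COMFORT_RADIUS_M = {"happy": 0.9, "neutral": 1.2, "sad": 1.4, "angry": 1.8}


def classify_emotion(pad: np.ndarray) -> str:
    """Assign a PAD vector to the nearest emotion centroid (Euclidean distance)."""
    return min(EMOTION_CENTROIDS,
               key=lambda e: np.linalg.norm(pad - EMOTION_CENTROIDS[e]))


def proxemic_constraint(pad: np.ndarray) -> float:
    """Return the minimum clearance (m) a planner would keep from this pedestrian."""
    return COMFORT_RADIUS_M[classify_emotion(pad)]


if __name__ == "__main__":
    # Example: a PAD estimate fused from face and trajectory cues (hypothetical values).
    estimated_pad = np.array([0.7, 0.4, 0.3])
    print(classify_emotion(estimated_pad), proxemic_constraint(estimated_pad))
```

In the full system, a clearance of this kind would act as an additional constraint on the local planner around each pedestrian; the sketch only illustrates the emotion-to-proxemics mapping, not the prediction or navigation components.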
