Building Blocks of Social Intelligence: Enabling Autonomy for Socially Intelligent and Assistive Robots

We present an overview of the control, recognition, decision-making, and learning techniques utilized by the Interaction Lab (robotics.usc.edu/interaction) at the University of Southern California (USC) to enable autonomy in sociable and socially assistive robots. These techniques are implemented with two software libraries: 1) the Social Behavior Library (SBL) provides autonomous social behavior controllers; and 2) the Social Interaction Manager (SIM) provides probabilistic models to recognize, reason over, and learn about human behavior. Both libraries are implemented in the Robot Operating System (ROS; www.ros.org) framework, and are made available to the community as open-source software packages in the USC ROS Package Repository (code.google.com/p/uscinteraction-software).
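To make the probabilistic side concrete, the sketch below shows a discrete Bayes filter of the general kind an interaction manager like SIM might use to recognize a human's behavioral state from an observation. The state names, observation model, and probability values are illustrative assumptions, not SIM's actual API or models.

```python
# Hypothetical sketch: recognizing a human behavior state with a discrete
# Bayes update. The states ("engaged"/"disengaged"), the observation
# ("mutual gaze"), and all probabilities are illustrative only.

def bayes_update(prior, likelihood):
    """Return the posterior P(state | observation) given a prior over
    states and the likelihood P(observation | state)."""
    unnormalized = {s: prior[s] * likelihood[s] for s in prior}
    z = sum(unnormalized.values())  # normalizing constant P(observation)
    return {s: p / z for s, p in unnormalized.items()}

# Uniform prior over two hypothetical human-behavior states.
prior = {"engaged": 0.5, "disengaged": 0.5}

# Illustrative observation model: mutual gaze is much more likely
# when the person is engaged with the robot.
likelihood_of_gaze = {"engaged": 0.8, "disengaged": 0.2}

posterior = bayes_update(prior, likelihood_of_gaze)
# After observing mutual gaze, belief shifts toward "engaged".
```

In a full system, such an update would run each time a perceptual cue arrives (gaze, speech, proxemic distance), and the resulting belief would feed the robot's behavior controllers.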
