The RUBI project: A progress report

The goal of the RUBI project is to accelerate progress in the development of social robots by addressing the problem at multiple levels, including the development of a scientific agenda, research methods, formal approaches, software, and hardware. The project is based on the idea that progress will go hand in hand with the emergence of a new scientific discipline focused on understanding the organization of adaptive behavior in real time, within the environments in which organisms operate. Accordingly, the RUBI project emphasizes design by immersion: embedding scientists, engineers, and robots in everyday environments so that those environments shape the hardware, software, and scientific questions as early as possible in the development process. The focus of the project so far has been on social robots that interact with 18- to 24-month-old toddlers as part of their daily activities at the Early Childhood Education Center at the University of California, San Diego. In this document we present an overall assessment of the lessons learned and the progress made through year two of the project.
