The task selection mechanism for interactive robots: Application to the intelligent life supporting system

The essential challenge in future ubiquitous networks is to make information available to people not only at any time, in any place, and in any form, but also to deliver the right thing at the right time in the right way by inferring the user's situation. Several psychological experiments show that there are associations between a user's situation, including the user's emotions, and the tasks that user selects. Building on those results, this article presents a situation-based task selection mechanism that enables a life-supporting robot system to perform tasks according to the user's situation. Driven by interactions between the robot and the user, the mechanism constructs and updates the association between the user's situation and the robot's tasks, so that the robot can adapt effectively to the user's task-related behaviors. Radial Basis Function Networks (RBFNs) and associative learning algorithms are used for this user adaptation. The proposed mechanism is applied to the CRF3 (Character Robot Face 3) system to demonstrate its feasibility and effectiveness. © 2006 Wiley Periodicals, Inc. Int J Intell Syst 21: 973–1004, 2006.
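
As a rough illustration only, the following Python sketch shows how an RBFN-based association between a user's situation and candidate tasks might be realized and updated from interaction feedback. The situation features, task set, network size, and learning rule below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

# Illustrative sketch only: feature names, tasks, and the update rule are
# assumptions for demonstration, not the mechanism described in the paper.

class SituationTaskRBFN:
    """Maps a situation feature vector (e.g., time of day, estimated mood)
    to activation scores over candidate robot tasks via an RBF network."""

    def __init__(self, centers, sigma, n_tasks, lr=0.1):
        self.centers = np.asarray(centers, dtype=float)   # (n_units, n_features)
        self.sigma = float(sigma)                         # common Gaussian width
        self.weights = np.zeros((len(centers), n_tasks))  # situation-task weights
        self.lr = lr                                      # learning rate

    def _hidden(self, x):
        # Gaussian basis activations, one per locally tuned unit
        d2 = np.sum((self.centers - x) ** 2, axis=1)
        return np.exp(-d2 / (2.0 * self.sigma ** 2))

    def task_scores(self, x):
        return self._hidden(np.asarray(x, dtype=float)) @ self.weights

    def select_task(self, x):
        # Pick the task most strongly associated with the current situation
        return int(np.argmax(self.task_scores(x)))

    def update(self, x, task_id, reward):
        # Delta-rule style associative update: strengthen (or weaken) the
        # link between the current situation and the executed task.
        h = self._hidden(np.asarray(x, dtype=float))
        error = reward - self.task_scores(x)[task_id]
        self.weights[:, task_id] += self.lr * error * h


# Hypothetical usage: a 2-D situation (normalized hour of day, estimated mood)
# and three candidate tasks (e.g., "greet", "remind schedule", "play music").
net = SituationTaskRBFN(centers=[[0.2, 0.3], [0.5, 0.8], [0.9, 0.1]],
                        sigma=0.3, n_tasks=3)
situation = [0.55, 0.75]
task = net.select_task(situation)
net.update(situation, task, reward=1.0)  # user accepted the suggested task
```

The localized Gaussian units mean each interaction mainly adjusts the weights of situations similar to the one just observed, which is one way such a system could adapt to an individual user without disturbing associations learned for very different situations.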
