Non-facial/non-verbal methods of affective expression as applied to robot-assisted victim assessment

This work applies a previously developed set of heuristics for determining when to use non-facial/non-verbal methods of affective expression to the domain of a robot used for victim assessment in the aftermath of a disaster. Robot-assisted victim assessment places a robot approximately three meters or less from a victim, and the robot's path traverses three proximity zones: intimate (contact to 0.46 m), personal (0.46 to 1.22 m), and social (1.22 to 3.66 m). Robot-eye and victim-eye views of an Inuktun robot were collected as it followed a path around the victim; the path was derived from observations of a prior robot-assisted medical reachback study. The victim's-eye views of the robot from seven points of interest on the path illustrate the appropriateness of each of the five primary non-facial/non-verbal methods of affective expression (body movement, posture, orientation, illuminated color, and sound), offering support for the heuristics as a design aid. In addition to supporting the heuristics, the investigation identified three open research questions on acceptable motions and the impact of the surroundings on robot affect.
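The proximity zones above follow Hall's proxemic ranges. A minimal sketch of how a robot-to-victim distance maps onto these zones (the helper function is hypothetical, not part of the original study):

```python
def proxemic_zone(distance_m: float) -> str:
    """Classify a robot-to-victim distance (meters) into Hall's
    proxemic zones, using the bounds given in the abstract."""
    if distance_m < 0:
        raise ValueError("distance must be non-negative")
    if distance_m <= 0.46:
        return "intimate"  # contact to 0.46 m
    if distance_m <= 1.22:
        return "personal"  # 0.46 to 1.22 m
    if distance_m <= 3.66:
        return "social"    # 1.22 to 3.66 m
    return "public"        # beyond the zones traversed in this study
```

For example, a robot approaching to about 1.0 m would be in the victim's personal zone, while the roughly three-meter standoff mentioned above places it in the social zone.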
