Multimodal Robot Feedback for Eldercare

The goal of the Robot-Era project is to enhance the quality of life and independence of elderly people through robotic assistance. When a robot interacts with users in the home, outdoors, or in assisted living facilities, the interpretability of its behaviour is critical to its usability and acceptance. Well-designed feedback is necessary both when the robot acts autonomously and when a human user controls it. In this position paper we describe our approach to designing multimodal feedback for the Robot-Era project, delivered through handheld devices as well as through the robot itself. We also discuss some of the unique challenges of designing feedback for an elderly user population.