Mechanical Ottoman: How Robotic Furniture Offers and Withdraws Support

This paper describes our approach to designing, developing behaviors for, and exploring the use of a robotic footstool, which we named the mechanical ottoman. By approaching unsuspecting participants and attempting to get them to place their feet on the footstool, and then later attempting to break the engagement and get people to take their feet down, we sought to understand whether and how motion can be used by non-anthropomorphic robots to engage people in joint action. In several embodied design improvisation sessions, we observed a tension between people perceiving the ottoman as a living being, such as a pet, and simultaneously as a functional object that requests they place their feet on it, something they would not ordinarily do with a pet. In a follow-up lab study (N=20), we found that most participants did make use of the footstool, although several chose not to place their feet on it for this reason. We also found that participants who rested their feet understood a brief lift-and-drop movement as a request to withdraw, and formed detailed notions about the footstool's agenda, ascribing intentions to it based on its movement alone.
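
For readers curious about how such cues might be scripted, the sketch below shows one minimal way to sequence an offer gesture and the brief lift-and-drop withdrawal request described above. It is a hypothetical illustration only: the LiftActuator interface, the gesture names, and all timing and height values are assumptions for the sketch, not the implementation used in the paper.

```python
import time
from dataclasses import dataclass


@dataclass
class LiftActuator:
    """Hypothetical stand-in for the ottoman's lift mechanism.

    A real system would command a motor; here we only log the target
    height and sleep, so the gesture timing can be inspected.
    """
    height_mm: float = 0.0

    def move_to(self, height_mm: float, duration_s: float) -> None:
        print(f"lift: {self.height_mm:.0f} mm -> {height_mm:.0f} mm "
              f"over {duration_s:.1f} s")
        time.sleep(duration_s)
        self.height_mm = height_mm


def offer_support(lift: LiftActuator) -> None:
    """Rise slowly, inviting the person to rest their feet (engagement)."""
    lift.move_to(height_mm=120, duration_s=2.0)


def request_withdrawal(lift: LiftActuator) -> None:
    """The brief lift-and-drop cue read as 'please take your feet down'."""
    lift.move_to(height_mm=140, duration_s=0.4)   # quick lift
    lift.move_to(height_mm=120, duration_s=0.3)   # quick drop back


def retreat(lift: LiftActuator) -> None:
    """Lower fully before (in a real robot) driving away (disengagement)."""
    lift.move_to(height_mm=0, duration_s=2.0)


if __name__ == "__main__":
    ottoman = LiftActuator()
    offer_support(ottoman)        # invite joint action
    time.sleep(1.0)               # placeholder for the support phase
    request_withdrawal(ottoman)   # signal intent to break the engagement
    retreat(ottoman)
```

Even in this toy form, the design choice the paper explores is visible: engagement and disengagement are communicated entirely through the timing and amplitude of motion, with no anthropomorphic features.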
