Transitional Explainer: Instruct Functions in the Real World and Onscreen in Multi-Function Printer

An office appliance such as a multi-function printer (MFP), which combines a printer, copier, scanner, and fax machine, requires users to learn both the manipulation of real-world objects and their abstract representation in a virtual world. Although MFPs are installed in most offices and stores, their features are often underused because users find them too difficult to understand. We therefore propose a 'transitional explainer,' an agent that instructs users in the features of an MFP by mixing real- and virtual-world representations. Blended reality, which has been proposed as a branch of augmented reality, blends virtual and real expressions to leverage their combined advantages. In this study, we exploit these advantages to show users how to operate a complex appliance through anthropomorphized explanations. The appliance explains itself, a style that improves users' retention of its features and enhances the motivation of all users, especially older ones, to learn its functions. Users interact with the transitional agent and thereby learn how to use the MFP: the agent hides its physical eyes and arms in onscreen mode and extends them in real-world mode. We implemented the transitional explainer as a blended reality agent on an MFP and evaluated how its transitional expression supports users' understanding of how to operate the MFP and enhances their motivation to use it.
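The core behavior described above, an agent that switches between an onscreen avatar and a physical embodiment depending on what it is explaining, can be sketched as a small state machine. This is a minimal illustration only; the class, mode names, and feature categories below are hypothetical and do not come from the paper's implementation.

```python
from enum import Enum, auto

class Mode(Enum):
    ONSCREEN = auto()     # agent rendered on the MFP's touch panel
    REAL_WORLD = auto()   # physical eyes and arms extended from the MFP

# Hypothetical split: which features are explained with the physical body.
PHYSICAL_FEATURES = {"paper tray", "scanner lid", "output tray"}

class TransitionalAgent:
    """Sketch of the transitional explainer's mode-switching logic."""

    def __init__(self):
        self.mode = Mode.ONSCREEN
        self.log = []  # records actuation commands for inspection

    def _actuate(self, action):
        # Placeholder: on real hardware this would drive servos/display.
        self.log.append(action)

    def to_real_world(self):
        if self.mode is Mode.REAL_WORLD:
            return
        self._actuate("hide onscreen avatar")
        self._actuate("extend eyes")
        self._actuate("extend arms")
        self.mode = Mode.REAL_WORLD

    def to_onscreen(self):
        if self.mode is Mode.ONSCREEN:
            return
        self._actuate("retract arms")
        self._actuate("retract eyes")
        self._actuate("show onscreen avatar")
        self.mode = Mode.ONSCREEN

    def explain(self, feature):
        # Physical parts are explained in real-world mode (pointing with
        # arms); menu items are explained onscreen.
        if feature in PHYSICAL_FEATURES:
            self.to_real_world()
        else:
            self.to_onscreen()
        self._actuate(f"explain {feature}")
```

A session might call `agent.explain("copy menu")` followed by `agent.explain("paper tray")`, causing the agent to hide its avatar and extend its eyes and arms between the two explanations.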
