Third Eye: Exploring the Affordances of Third-Person View in Telepresence Robots

Social interaction through telepresence robots can be challenging for a robot operator due to a lack of spatial awareness caused by limited idiothetic cues and the narrow field of view of the robot's camera. We explore the use of a third-person perspective, popular in video game design, to provide these missing spatial cues to remote robot operators. We present the design and implementation of Third Eye, a system that enables controlling telepresence robots through a third-person view. Third Eye comprises a controllable, wide-field-of-view third-person camera attached to the robot and bimanual controls for remote operation. Observations from a user study show that Third Eye gave robot operators better awareness of the robot 'bodies' they controlled, which in turn afforded new operator behaviors. In addition, the camera design supported ecologically valid interaction for social telepresence. Quantitative data shows that Third Eye's navigation efficiency is comparable to that of existing systems.
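
The abstract does not specify how the two hands are mapped to the robot and the camera. The sketch below is a minimal, hypothetical Python illustration of one plausible bimanual control loop; all class names, parameter values, and the hand-to-device assignment are assumptions for illustration, not the paper's actual API. In this sketch the non-dominant hand steers the wide-field-of-view third-person camera while the dominant hand drives the robot base, loosely following Guiard's model of asymmetric bimanual action.

```python
from dataclasses import dataclass

# Hypothetical input and actuator interfaces; the paper does not describe
# the real hardware APIs, so these stand in for whatever Third Eye uses.

@dataclass
class HandInput:
    x: float  # lateral axis, normalized to [-1, 1]
    y: float  # forward/vertical axis, normalized to [-1, 1]

@dataclass
class RobotCommand:
    linear: float   # forward velocity (m/s)
    angular: float  # turn rate (rad/s)

@dataclass
class CameraCommand:
    pan: float   # degrees relative to current camera orientation
    tilt: float  # degrees relative to current camera orientation

MAX_LINEAR = 0.5    # m/s, assumed cap for a telepresence base
MAX_ANGULAR = 1.0   # rad/s, assumed cap
CAMERA_STEP = 5.0   # degrees per input tick, assumed

def bimanual_update(dominant: HandInput, non_dominant: HandInput):
    """Map the two hands onto the two controllable frames.

    The non-dominant hand sets the frame of reference (the third-person
    camera) while the dominant hand performs the finer task (driving the
    robot base). This split is an assumption, not the paper's design.
    """
    drive = RobotCommand(
        linear=dominant.y * MAX_LINEAR,
        angular=-dominant.x * MAX_ANGULAR,
    )
    view = CameraCommand(
        pan=non_dominant.x * CAMERA_STEP,
        tilt=non_dominant.y * CAMERA_STEP,
    )
    return drive, view

if __name__ == "__main__":
    # Example tick: push the dominant stick forward, nudge the camera left.
    drive, view = bimanual_update(HandInput(0.0, 0.8), HandInput(-0.3, 0.0))
    print(drive, view)
```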
