You as a Puppet: Evaluation of Telepresence User Interface for Puppetry

We propose an immersive telepresence system for puppetry that transmits a human performer's body and facial movements to a puppet while providing audiovisual feedback to the performer. Cameras mounted in place of the puppet's eyes stream live video to a head-mounted display (HMD) worn by the performer, so that performers see through the puppet's eyes with their own eyes and gain a visual understanding of the puppet's surroundings. Conventional methods of manipulating a puppet (hand puppets, string puppets, and rod puppets) require practice, make interaction with the audience difficult, and force the puppeteer to be positioned exactly where the puppet is. The proposed system addresses these issues by enabling a human performer to manipulate the puppet remotely using his or her body and facial movements. We conducted several user studies with both beginners and professional puppeteers. The results show that, unlike conventional methods, the proposed system facilitates the manipulation of puppets, especially for beginners. Moreover, the system allows performers to enjoy puppetry and to captivate audiences.
