Mirror Puppeteering: Animating Toy Robots in Front of a Webcam

Mirror Puppeteering is a system for easily creating gestures ("animations") for robotic toys, custom robots, and virtual characters. Lay users can record animations simply by moving a robot's limbs in front of a webcam. Makers and hobbyists can use the system to quickly set up their custom-built robots for animation. Gamers and amateur animators can control virtual characters in real time or record animations for them. Our system works by tracking circular markers on the robot's surface and translating their positions into motor commands, using a calibration map between marker locations in camera space and motor angles. New robots can be set up for Mirror Puppeteering quickly and without knowledge of the robot's 3D structure, as we demonstrate on several robots. In a user study, participants found our method more enjoyable, usable, and successful than traditional animation methods, and easier to learn.
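To make the pipeline concrete, here is a minimal Python sketch of the two stages the abstract describes: detecting circular markers in a webcam frame, and mapping camera-space marker positions to motor angles. It assumes OpenCV for blob detection; the `CalibrationMap` class, its nearest-neighbor lookup, and the `send_motor_command` callback are hypothetical stand-ins for illustration, not the paper's actual implementation.

```python
import cv2
import numpy as np

def detect_markers(frame):
    """Detect circular markers in a webcam frame (illustrative parameters)."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    params = cv2.SimpleBlobDetector_Params()
    params.filterByCircularity = True
    params.minCircularity = 0.8
    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray)
    return np.array([kp.pt for kp in keypoints])  # (N, 2) marker centers in pixels

class CalibrationMap:
    """Hypothetical map from camera-space marker positions to motor angles.

    During setup, sweep a motor through its range and record
    (marker position, motor angle) pairs; at puppeteering time, look up
    the nearest recorded position to recover a motor angle.
    """
    def __init__(self, marker_positions, motor_angles):
        self.positions = np.asarray(marker_positions, dtype=float)  # (M, 2)
        self.angles = np.asarray(motor_angles, dtype=float)         # (M,)

    def to_motor_angle(self, marker_xy):
        dists = np.linalg.norm(self.positions - marker_xy, axis=1)
        return self.angles[np.argmin(dists)]

def puppeteer(calibration, send_motor_command):
    """Track one marker per frame and emit a motor command for it."""
    cap = cv2.VideoCapture(0)
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            markers = detect_markers(frame)
            if len(markers) > 0:
                angle = calibration.to_motor_angle(markers[0])
                send_motor_command(angle)  # placeholder robot interface
    finally:
        cap.release()
```

A nearest-neighbor lookup is the simplest plausible realization of such a calibration map; a full system would likely interpolate between calibration samples and handle multiple markers and motors jointly.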
