An actuated physical puppet as an input device for controlling a digital manikin

We present an actuated handheld puppet system for controlling the posture of a virtual character. Physical puppet devices have been used in the past to intuitively control character posture. In our research, an actuator is added to each joint of such an input device to provide physical feedback to the user. This enhancement offers many benefits. First, the user can upload pre-defined postures to the device to save time. Second, the system is capable of dynamically adjusting joint stiffness to counteract gravity, while allowing control to be maintained with relatively little force. Third, the system supports natural human body behaviors, such as whole-body reaching and joint coupling. This paper describes the user interface and implementation of the proposed technique and reports the results of an expert evaluation. We also conducted two user studies to evaluate the effectiveness of our method.
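
The gravity-counteracting behavior mentioned above can be illustrated with a small sketch. The snippet below is a minimal, hypothetical example rather than the paper's implementation: it computes the static holding torque each joint of a planar serial-link puppet arm would need to keep its distal links from sagging under gravity, using made-up link lengths and masses; the device described here adjusts joint stiffness dynamically rather than commanding exact torques.

```python
# Minimal sketch (not the authors' implementation): static gravity
# compensation for a planar serial-link puppet arm. Link lengths, masses,
# and the joint ordering are illustrative assumptions.
import math

LINK_LENGTHS = [0.10, 0.08, 0.06]   # metres, assumed
LINK_MASSES  = [0.05, 0.04, 0.03]   # kilograms, assumed
G = 9.81                            # gravitational acceleration (m/s^2)

def gravity_torques(joint_angles):
    """Return the holding torque (N*m) each joint must apply so that the
    links distal to it do not sag under gravity.

    joint_angles: relative joint angles in radians, measured in a vertical
    plane; the x axis is horizontal.
    """
    n = len(joint_angles)

    # Absolute orientation of each link in the world frame.
    abs_angles = []
    total = 0.0
    for a in joint_angles:
        total += a
        abs_angles.append(total)

    # Horizontal position of each joint and of each link's centre of mass
    # (taken at mid-link).
    joint_x = [0.0]
    com_x = []
    for i in range(n):
        com_x.append(joint_x[i] + 0.5 * LINK_LENGTHS[i] * math.cos(abs_angles[i]))
        joint_x.append(joint_x[i] + LINK_LENGTHS[i] * math.cos(abs_angles[i]))

    # Torque at joint i = sum over distal links of mass * g * horizontal lever arm.
    torques = []
    for i in range(n):
        tau = sum(LINK_MASSES[j] * G * (com_x[j] - joint_x[i]) for j in range(i, n))
        torques.append(tau)
    return torques

# Example: a slightly raised arm pose.
print(gravity_torques([math.radians(30), math.radians(20), math.radians(-10)]))
```

In a stiffness-based scheme like the one the abstract describes, such a torque estimate would only set the bias point around which each actuated joint holds its pose compliantly, so the user can still reposition the puppet with relatively little force.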
