ShadowPuppets: supporting collocated interaction with mobile projector phones using hand shadows

Pico projectors attached to mobile phones let users view phone content on a large display. However, to provide input to a projector phone, users must look down at the device, diverting their attention from the projected image; moreover, other collocated users have no way to interact with the device at all. We present ShadowPuppets, a system that supports collocated interaction with mobile projector phones by letting users cast hand shadows as input. Most people already know how to cast hand shadows, making them an easy input modality, and shadows implicitly support collocated use: nearby users can cast shadows as input, and one user can see and understand another user's hand shadows. We report three user studies. The first elicits the hand shadows users expect to cause various effects; the second examines the reverse mapping, asking what effects users think various hand shadows will cause. Finally, we present qualitative results from a study with our functional prototype and discuss design implications for systems that use shadows as input. Our findings suggest that shadow input offers a natural, intuitive way to interact with projected interfaces and can support collocated collaboration.
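The abstract does not describe the implementation, but a camera-projector system of this kind typically segments the dark shadow region from the bright projected image. The following is a minimal sketch of that idea under stated assumptions: the camera frame is grayscale, the projector keeps the surface bright, and a fixed darkness threshold suffices. The function names and the threshold value are illustrative assumptions, not details from the paper.

```python
import numpy as np


def detect_shadow_mask(frame: np.ndarray, threshold: int = 60) -> np.ndarray:
    """Return a boolean mask of pixels dark enough to be a cast shadow.

    Assumption (not from the paper): the projector keeps the surface
    bright, so a hand shadow is markedly darker than its surroundings.
    `frame` is a grayscale camera image as a uint8 array.
    """
    return frame < threshold


def shadow_centroid(mask: np.ndarray):
    """Centroid (row, col) of the shadow region, or None if none found."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return float(ys.mean()), float(xs.mean())


# Synthetic frame: bright projection (value 200) with a dark "hand shadow".
frame = np.full((120, 160), 200, dtype=np.uint8)
frame[40:80, 60:100] = 20  # the shadow blob

mask = detect_shadow_mask(frame)
print(shadow_centroid(mask))  # prints (59.5, 79.5), the blob's centre
```

A real system would need to handle varying projected content (e.g. via background subtraction against the known framebuffer) and track the shadow's shape over time to recognize gestures; this sketch only shows the basic segmentation step.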
