MouseLight: bimanual interactions on digital paper using a pen and a spatially-aware mobile projector

MouseLight is a standalone, spatially aware mobile projector with the form factor of a mouse that can be used in combination with digital pens on paper. By interacting with the projector and the pen bimanually, users can visualize and modify virtually augmented content on top of the paper, and seamlessly transition between virtual and physical information. We present a high-fidelity hardware prototype of the system and demonstrate a set of novel interactions specifically tailored to the unique properties of MouseLight. MouseLight differs from related systems such as PenLight in two respects. First, it offers a rich set of bimanual interactions inspired by the ToolGlass interaction metaphor, but applied to physical paper. Second, it explores novel displaced interactions that take advantage of independent input and output devices, each spatially aware of the underlying paper. These properties enable users to issue remote commands such as copy and paste or search. We also report on a preliminary evaluation of the system, which produced encouraging observations and feedback.
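
As a minimal illustration of the geometry such a spatially aware overlay implies, the sketch below shows how pen strokes captured in page coordinates could be mapped into the projector's image frame given the projector's tracked 2D pose on the page. This is a hypothetical sketch for exposition only: the names (ProjectorPose, paper_to_display) and the numeric constants are assumptions, not part of the MouseLight implementation described in the paper.

```python
# Hypothetical sketch: map pen strokes (page coordinates, mm) into the
# projector's image frame, given the projector's tracked 2D pose on the page.
# All names and constants are illustrative assumptions, not the authors' code.
from dataclasses import dataclass

import numpy as np


@dataclass
class ProjectorPose:
    x: float      # projector position on the page (mm)
    y: float
    theta: float  # projector rotation relative to the page (radians)


def paper_to_display(points_mm: np.ndarray, pose: ProjectorPose,
                     px_per_mm: float, display_size: tuple) -> np.ndarray:
    """Transform Nx2 pen points from page coordinates into projector pixels.

    Inverse rigid transform: translate by the projector position, rotate by
    -theta, then scale to pixels and re-center on the projected image.
    """
    c, s = np.cos(-pose.theta), np.sin(-pose.theta)
    rotation = np.array([[c, -s], [s, c]])
    local = (points_mm - np.array([pose.x, pose.y])) @ rotation.T
    width, height = display_size
    return local * px_per_mm + np.array([width / 2, height / 2])


# Usage: a stroke drawn near the projector contributes to the overlay only if
# its pixels fall inside the projected frame on the page.
stroke = np.array([[120.0, 80.0], [121.5, 82.0]])            # pen samples (mm)
pose = ProjectorPose(x=110.0, y=75.0, theta=np.deg2rad(15))  # tracked pose
pixels = paper_to_display(stroke, pose, px_per_mm=6.0, display_size=(800, 600))
visible = np.all((pixels >= 0) & (pixels < [800, 600]), axis=1)
print(pixels, visible)
```

Because the pen and the projector are tracked independently in the same page coordinate system, the same transform also supports displaced operations: content selected with the pen at one location can be rendered wherever the projector currently points.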
