Projected interfaces: enabling serendipitous interaction with smart tangible objects

The Projected Interfaces architecture enables bidirectional user interaction with smart tangible objects. Smart objects act as input and output devices simultaneously: by cooperating with projector-camera systems, they obtain a projected display on their own surfaces. Tangible manipulation of the object, combined with camera-based tracking, allows direct interaction with that projected display. Such hybrid interfaces benefit from both the flexibility of graphical user interfaces (GUIs) and the intuitiveness of tangible user interfaces (TUIs). In this paper we present an interaction model for projected interfaces, an architecture design, and a proof-of-concept implementation using an augmented photograph album.
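The interaction cycle described above can be sketched minimally: the camera tracks the smart object's pose, and the projector renders the object's requested content aligned with that pose. This is an illustrative sketch only; all class and function names (`SmartObject`, `camera_track`, `project`, `interaction_step`) are assumptions, not APIs from the paper.

```python
# Hypothetical sketch of one Projected Interfaces cycle: track the smart
# object with the camera, then project its display content onto its surface.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SmartObject:
    object_id: str
    display_content: str   # content the object asks to have projected

@dataclass
class Pose:
    x: float               # object position in camera coordinates
    y: float
    angle_deg: float       # in-plane rotation of the tracked surface

def camera_track(frame: dict, object_id: str) -> Optional[Pose]:
    """Stand-in for vision-based tracking (e.g. feature matching)."""
    return frame.get(object_id)

def project(obj: SmartObject, pose: Pose) -> dict:
    """Stand-in for warping content so it lands aligned on the object."""
    return {"content": obj.display_content,
            "at": (pose.x, pose.y),
            "rotation": pose.angle_deg}

def interaction_step(frame: dict, obj: SmartObject) -> Optional[dict]:
    """One cycle: track the object, then render onto its surface."""
    pose = camera_track(frame, obj.object_id)
    if pose is None:
        return None        # object not visible: nothing to project
    return project(obj, pose)

# Example with the paper's photograph-album scenario:
album = SmartObject("photo_album", "page 12: holiday photos")
frame = {"photo_album": Pose(x=0.4, y=0.2, angle_deg=15.0)}
print(interaction_step(frame, album))
```

In a real system the `frame` dictionary would be replaced by feature-based detection and tracking in camera images, and `project` would apply a projector-camera homography rather than return a dictionary.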
