Examining interaction with general-purpose object recognition in LEGO OASIS

Improvements in cameras, computer vision, and machine learning are enabling real-time object recognition in interactive systems. Reliable recognition of uninstrumented objects opens up exciting new scenarios using the real-world objects that surround us. At the same time, it introduces the need to understand and manage the uncertainty and ambiguity that are inherent to such sensing. This paper examines this problem in the context of LEGO OASIS, a camera- and projector-based system that recognizes LEGO toys and augments them with projected digital content. We focus on an interaction language that models the creation and manipulation of relationships between physical objects and their digital capabilities. We use this set of abstractions to examine different notions of recognition error and to explore interactive approaches to overcoming fundamental challenges in object-aware systems.
