Steerable augmented reality with the Beamatron

Steerable displays use a motorized platform to orient a projector so that graphics can be displayed at any point in a room. Often a camera is included to recognize markers, other objects, and user gestures in the display volume. Such systems can superimpose graphics onto the real world, and so are useful in a number of augmented reality and ubiquitous computing scenarios. We contribute the Beamatron, which advances steerable displays by drawing on recent progress in depth-camera-based interaction. The Beamatron consists of a computer-controlled pan-and-tilt platform on which a projector and a Microsoft Kinect sensor are mounted. Whereas much previous work on steerable displays deals primarily with projecting corrected graphics onto a discrete set of static planes, we describe computational techniques that enable reasoning in 3D using live depth data. We show two example applications enabled by the unique capabilities of the Beamatron: an augmented reality game in which a player drives a virtual toy car around a room, and a ubiquitous computing demo that uses speech and gesture to move projected graphics throughout the room.
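The core technical challenge the abstract names is reasoning in 3D from a sensor that moves with the platform: depth measurements arrive in the camera's frame, but graphics must stay registered to fixed room coordinates as the platform pans and tilts. As a rough illustration only (not the paper's implementation), the Python sketch below shows one plausible way to map a depth-camera point into room coordinates from the current pan and tilt angles; the rotation conventions, the camera-to-platform calibration transform, and the example mounting values are all assumptions introduced here for clarity.

```python
# Hypothetical sketch: mapping a depth-camera point into fixed room
# coordinates for a pan-tilt platform. Names and calibration values are
# illustrative assumptions, not the Beamatron's actual implementation.
import numpy as np

def rot_y(theta):
    """Rotation about the vertical (pan) axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[ c, 0, s],
                     [ 0, 1, 0],
                     [-s, 0, c]])

def rot_x(phi):
    """Rotation about the horizontal (tilt) axis by phi radians."""
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[1, 0,  0],
                     [0, c, -s],
                     [0, s,  c]])

def camera_point_to_room(p_cam, pan, tilt, R_cam2plat, t_cam2plat, t_plat):
    """Transform a 3D point from depth-camera coordinates to room
    coordinates, given the platform's current pan/tilt angles and a
    calibrated rigid transform from the camera to the platform frame."""
    p_plat = R_cam2plat @ p_cam + t_cam2plat    # camera -> platform frame
    p_room = rot_y(pan) @ rot_x(tilt) @ p_plat  # apply platform rotation
    return p_room + t_plat                      # platform origin in the room

# Example: a point 2 m in front of the camera, platform panned 30 degrees
# and tilted down 10 degrees, mounted 2.5 m up (all values assumed).
p = camera_point_to_room(
    p_cam=np.array([0.0, 0.0, 2.0]),
    pan=np.radians(30), tilt=np.radians(-10),
    R_cam2plat=np.eye(3), t_cam2plat=np.zeros(3),  # assume aligned mounting
    t_plat=np.array([0.0, 2.5, 0.0]))
print(p)
```

The same transform run in reverse (room point to camera frame, then through the projector's intrinsics) would let the system render geometry-corrected graphics onto arbitrary surfaces observed in the live depth data, which is what distinguishes this approach from projecting onto a fixed set of pre-calibrated planes.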
