Extending touch: towards interaction with large-scale surfaces

Touch is an intuitive modality for interacting with objects displayed on arbitrary surfaces. On large-scale surfaces, however, not every point is within reach, so an extension is needed that preserves the intuitiveness of touch: pointing. We present a system that supports both input modalities in a single framework. Our method is based on 3D reconstruction using only standard RGB cameras and allows seamless switching between touch and pointing, even during interaction. The approach scales well to large surfaces without requiring any modification of them. We report a technical evaluation of the system's accuracy as well as a user study, which showed that users preferred our system over a touch-only baseline because they had more freedom during interaction and completed the given task significantly faster.
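To make the switching idea concrete, the following is a minimal sketch, not the authors' implementation: it assumes a reconstructed 3D fingertip and elbow position (e.g., from camera-based 3D reconstruction), a known display plane, and a hypothetical distance threshold. A fingertip close to the surface is treated as touch; otherwise the arm direction is cast as a ray and intersected with the display plane to obtain a pointing target.

```python
import numpy as np

# Assumed display plane in world coordinates: a point on the plane and its unit normal.
PLANE_POINT = np.array([0.0, 0.0, 0.0])
PLANE_NORMAL = np.array([0.0, 0.0, 1.0])

TOUCH_THRESHOLD_M = 0.03  # hypothetical: fingertips within 3 cm of the surface count as touch


def distance_to_plane(point: np.ndarray) -> float:
    """Signed distance of a 3D point to the display plane."""
    return float(np.dot(point - PLANE_POINT, PLANE_NORMAL))


def ray_plane_intersection(origin: np.ndarray, direction: np.ndarray):
    """Intersect a pointing ray with the display plane; returns None if parallel or behind."""
    denom = np.dot(direction, PLANE_NORMAL)
    if abs(denom) < 1e-6:
        return None
    t = np.dot(PLANE_POINT - origin, PLANE_NORMAL) / denom
    if t < 0:
        return None
    return origin + t * direction


def resolve_interaction(fingertip: np.ndarray, elbow: np.ndarray):
    """Map a reconstructed arm pose to either a touch point or a pointing target."""
    if abs(distance_to_plane(fingertip)) < TOUCH_THRESHOLD_M:
        return "touch", fingertip
    direction = fingertip - elbow
    direction = direction / np.linalg.norm(direction)
    target = ray_plane_intersection(fingertip, direction)
    return ("point", target) if target is not None else ("none", None)


if __name__ == "__main__":
    # Fingertip 5 mm from the wall -> touch; 80 cm away -> pointing via ray casting.
    print(resolve_interaction(np.array([0.5, 1.2, 0.005]), np.array([0.5, 1.0, 0.30])))
    print(resolve_interaction(np.array([1.0, 1.5, 0.80]), np.array([1.0, 1.3, 1.10])))
```

Because both modalities reduce to a single 3D point on the surface, an application can consume touch and pointing events through the same interface; the threshold and ray construction above are illustrative choices rather than the paper's specific parameters.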
