Building a gesture-based information terminal

In this thesis a Multi-Pointer Virtual Touchscreen system is developed that allows the use of pointing gestures in a standard GUI. Unlike most related projects, the optical tracking is supported by an acoustic tap detection method. The system is designed as an information terminal for public places, though other applications are possible. It uses a projector screen as output. Users can control several pointers, similar to mouse pointers in a standard desktop interface. Objects can be selected and dragged by tapping the table surface with the fingertip. The optical tracking system derives one pointer position per hand. A novel Multi-Pointer X Server (MPX) is utilized and configured to handle these coordinates like normal mouse input. The new architecture can display several mouse cursors and supports multi-pointer-aware applications, which handle the input streams independently and allow simultaneous actions. Standard applications can still be operated in the conventional manner. Surface touches are detected with a tangible acoustic interface: tap locations are distinguished by Time Difference of Arrival (TDOA) estimation, whose mathematical basis is the Generalized Cross-Correlation with Phase Transform (GCC-PHAT). Employing only a stereo audio input allows tap locations to be differentiated, though the achieved accuracy is still limited.
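The core of the acoustic stage, GCC-PHAT time-delay estimation, can be sketched as follows. This is a minimal illustration using NumPy, not the thesis implementation; the function name `gcc_phat` and the parameter choices (48 kHz sample rate, the small whitening constant) are assumptions for the example. The phase transform divides the cross-spectrum by its magnitude, so only phase information, which carries the delay, remains.

```python
import numpy as np

def gcc_phat(sig, ref, fs=48000):
    """Estimate the time delay of `sig` relative to `ref` via GCC-PHAT.

    Returns the delay in seconds; positive means `sig` lags `ref`.
    """
    n = len(sig) + len(ref)              # zero-pad to avoid circular wrap-around
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    R = SIG * np.conj(REF)               # cross-power spectrum
    R /= np.abs(R) + 1e-12               # phase transform: whiten magnitudes
    cc = np.fft.irfft(R, n=n)            # generalized cross-correlation
    max_shift = n // 2
    cc = np.concatenate((cc[-max_shift:], cc[:max_shift + 1]))
    shift = np.argmax(np.abs(cc)) - max_shift
    return shift / fs
```

With a stereo input (one microphone per channel), the estimated delay between the two channels constrains the tap position to a hyperbola on the table surface, which is what limits a two-sensor setup to coarse differentiation of tap locations rather than precise localization.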
