Towards User-Aware Multi-touch Interaction Layer for Group Collaborative Systems

State-of-the-art collaborative workspaces are typically built around large tabletops or wall-sized interactive displays. Extending bare multi-touch capability with metadata that associates touch events with individual users could significantly improve the collaborative work of co-located groups. In this paper, we present several techniques that enable the development of such interactive environments. First, we describe an algorithm for scalable coupling of multiple touch sensors and a method for associating touch events with individual users. We then briefly discuss the Multi-Sensor (MUSE) framework, which builds on these two techniques and allows rapid development of touch-based user interfaces. Finally, we discuss preliminary results from a prototype implementation.
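To illustrate the two core ideas, the following is a minimal sketch (not the paper's actual algorithm) of how touches from multiple coupled sensors might be mapped into one shared workspace coordinate system and then attributed to the nearest tracked user. All names (`Touch`, `Sensor`, the per-sensor offsets, and the user position table, e.g. from an overhead tracker) are hypothetical, introduced here only for illustration:

```python
from dataclasses import dataclass
import math

@dataclass
class Touch:
    sensor_id: int
    x: float  # coordinates local to one touch sensor
    y: float

@dataclass
class Sensor:
    sensor_id: int
    offset_x: float  # origin of this sensor within the shared workspace
    offset_y: float

def to_workspace(touch: Touch, sensors: dict) -> tuple:
    """Map a touch from local sensor coordinates into the shared workspace."""
    s = sensors[touch.sensor_id]
    return (s.offset_x + touch.x, s.offset_y + touch.y)

def associate_user(point: tuple, users: dict) -> str:
    """Attribute a workspace touch to the nearest tracked user position."""
    return min(users, key=lambda name: math.dist(point, users[name]))

# Two side-by-side 1920-px-wide sensors forming one coupled workspace,
# and two users whose positions come from a (hypothetical) tracking source.
sensors = {0: Sensor(0, 0.0, 0.0), 1: Sensor(1, 1920.0, 0.0)}
users = {"alice": (500.0, 400.0), "bob": (2500.0, 400.0)}

t = Touch(sensor_id=1, x=100.0, y=350.0)
p = to_workspace(t, sensors)      # (2020.0, 350.0) in workspace coordinates
print(associate_user(p, users))   # prints "bob"
```

A real system would also have to handle sensor calibration, touch tracking over time, and ambiguous cases where users stand close together; this sketch only shows the coordinate-coupling and nearest-user association steps in isolation.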
