Ambiguity in Multimodal Interaction with Multi-touch Multi-user Graphics Tables

Graphics surfaces have great potential for supporting cooperative work, provided they can accommodate a large enough team. However, large multi-user multi-touch tables raise a problem of ambiguity: since gestures are anonymous, it is unclear who is actually interacting with the table. This problem becomes even more severe when multimodal interaction is introduced, for example through vocal channels. At UTC we have built a system comprising a large graphics table and peripheral devices to support preliminary cooperative design through multimodal interaction. This paper relates the ambiguity problems we encountered and how we are trying to solve them.