Group coordination and negotiation through spatial proximity regions around mobile devices on augmented tabletops

Negotiation and coordination of activities involving a number of people can be a difficult and time-consuming process, even when all participants are collocated. We propose the use of spatial proximity regions around mobile devices on a table to significantly reduce the effort of proposing and exploring content within a group of collocated people. To determine the location of devices on ordinary tables, we developed a tracking mechanism for a camera-projector system that uses dynamic visual markers displayed on the screen of each device. We evaluated our approach based on spatial proximity regions using a photo-sharing application for people seated around a table. The tabletop provides a frame of reference in which the spatial arrangement of devices signals the coordination state to the users. The results from the study indicate that the proposed approach facilitates coordination in several ways, for example, by allowing for simultaneous user activity and by reducing the effort required to achieve a common goal. Our approach reduced task completion time by 43% and was rated as superior to other established techniques.
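The abstract does not specify the geometry of the proximity regions or the coordinate system used by the tracker; a minimal sketch, assuming circular regions of fixed radius around each device position reported by the camera-projector tracker (all names and parameters here are illustrative, not from the paper), could look like:

```python
import math
from dataclasses import dataclass

@dataclass
class Device:
    """A mobile device tracked on the tabletop (coordinates in cm, hypothetical)."""
    name: str
    x: float
    y: float

def in_proximity_region(device: Device, item_x: float, item_y: float,
                        radius: float = 15.0) -> bool:
    """Return True if an item at (item_x, item_y) lies inside the assumed
    circular proximity region of `radius` cm around `device`."""
    return math.hypot(item_x - device.x, item_y - device.y) <= radius

# Example: a shared photo dropped near one device enters its region,
# signalling to the group that it is being proposed to that device's owner.
phone = Device("phone-A", 40.0, 30.0)
print(in_proximity_region(phone, 45.0, 33.0))  # close to the device -> True
print(in_proximity_region(phone, 90.0, 80.0))  # far from the device -> False
```

A real system would drive such a membership test from the tracker's live device positions and use entering/leaving events to trigger content transfer or visual feedback projected onto the table.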
