Lumisight Table: a face-to-face collaboration support system that optimizes the direction of projected information for each stakeholder

The goal of our research is to support cooperative work performed by stakeholders sitting around a table. To support such cooperation, various table-based systems with a shared electronic display on the tabletop have been developed. These systems, however, share a common problem: stakeholders cannot perceive shared information such as text and images equally well, because the displayed information is oriented unfavorably for some viewing positions. To solve this problem, we propose the Lumisight Table, a system capable of simultaneously displaying personalized information in each required direction on one horizontal screen by multiplexing the images, and of capturing stakeholders' gestures to manipulate that information.
