Lumisight Table: an interactive view-dependent tabletop display

A novel tabletop display provides a different image to each user surrounding the system and can also capture users' gestures and physical objects placed on the tabletop. The Lumisight Table is based on the optical design of a special screen composed of a building material called Lumisty and a Fresnel lens. The system combines these films and the lens with four projectors to display four different images, one for each user's view. This display medium also requires appropriate input methods: in the current state of the project, users can control computers by placing physical objects on the display or by placing their hands over it. The screen design additionally lets a camera inside the system capture the appearance of the tabletop. Another main goal is to develop attractive, specialized applications for the Lumisight Table, including games and applications for computer-supported cooperative work (CSCW) environments. The projected images can be completely different from each other, or partially identical and partially different; users can share the identical parts as public information because all users can see them. This article is available with a short video documentary on CD-ROM.