Face-to-Face Collaborative Interfaces

Publisher Summary

This chapter provides an introduction to research and developments in multitouch input technologies that can be used to realize large interactive tabletop, or "surface user interface" (SUI), systems. Such hardware, along with supporting software, enables applications controlled through direct touch or multitouch. The chapter also reviews gestural interactions and design guidelines for SUI design for collaboration.

Multitouch SUIs can be mounted vertically on walls or horizontally on tables, and they sense the location of one or more fingers when contact is made with the surface. SUIs are used in public places (kiosks, ATMs) and in small personal devices (PDAs, iPhones) where a separate keyboard and mouse cannot or should not be used. Basic SUIs have been common for over 20 years in the form of interactive kiosks, ATMs, and point-of-sale systems, which rely on touch-screen technology with simple button interfaces. The current generation of SUIs suitable for face-to-face interaction is built on LCD displays or form-factored into walls or coffee tables; in this form they cannot yet be considered everyday objects. However, display technologies are now ubiquitous, and if SUI interaction styles can be woven into the environments and activities of everyday life, and their industrial design improved, invisibility in action can be achieved.

The ultimate goal of surface user interfaces in collaborative face-to-face activities is for people not to feel they are using a computer; instead, the visual elements should naturally support their actions. Ultimately, SUIs will become so commonplace in everyday life that no one will notice their presence. They will be aesthetic and powerful and will enhance our lives, but they will also be commonplace, obvious, and boring.
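The summary notes that SUIs sense the location of fingers in contact with the surface; to turn raw per-frame contact points into stable, trackable touches (so a drag or multitouch gesture can be followed over time), systems typically match each new frame's contacts to the previous frame's. The sketch below is a minimal, hypothetical illustration of one common approach, greedy nearest-neighbour matching; it is not code from the chapter, and the function name, point format, and distance threshold are invented for illustration.

```python
import math

def match_touches(prev, curr, max_dist=50.0):
    """Match touch IDs across frames by greedy nearest-neighbour search.

    prev: dict mapping touch id -> (x, y) from the previous frame.
    curr: list of (x, y) contact points detected in the current frame.
    max_dist: largest movement (in surface units) still treated as the
        same finger; anything farther is considered a new touch.
    Returns a dict mapping touch id -> index into curr.
    """
    assignments = {}
    used = set()
    # Try to continue each existing touch with its nearest new contact.
    for pid, (px, py) in prev.items():
        best, best_d = None, max_dist
        for i, (cx, cy) in enumerate(curr):
            if i in used:
                continue
            d = math.hypot(cx - px, cy - py)
            if d < best_d:
                best, best_d = i, d
        if best is not None:
            assignments[pid] = best
            used.add(best)
    # Any unmatched contact is a newly arrived finger: give it a fresh id.
    next_id = max(prev, default=-1) + 1
    for i in range(len(curr)):
        if i not in used:
            assignments[next_id] = i
            next_id += 1
    return assignments
```

For example, two fingers that each move a few pixels between frames keep their IDs, while a contact appearing far from any previous touch is assigned a new ID. Real systems (including the camera-based tabletops surveyed in the chapter) layer filtering and prediction on top of this basic step.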
