Interaction Technologies for Large Displays - An Overview

Large displays and the visualizations shown on such display systems are becoming increasingly popular. Interacting with these systems requires new devices and interaction techniques tailored to their specific needs. This paper presents an overview of state-of-the-art devices and techniques and discusses their respective pros and cons.
